This is my first time working with SQL. I'm writing a program that needs to query a SQL database very many times, very quickly. The database I have made will not change; it is 7 columns wide and 2.5 million rows deep (though I eventually want one with 200 million rows). Column 1 is a unique 10-digit number, and columns 2-7 are 2-digit numbers. I believe I have column 1 indexed as PRIMARY, id, and id_2. I need to be able to search my current database a couple hundred million times in about 1 second — is this possible with SQL? I'm writing the program in Java, using a PreparedStatement with query = "SELECT rank0, rank1, rank2, rank3, rank4, rank5 FROM ranks WHERE deckForm = ?". On my machine (XAMPP/phpMyAdmin, i5, 6 GB RAM) it currently takes about 90 seconds to run one hundred million SELECTs.

Also, it took me about 17 hours to create the database using a program I wrote. Is there a way to make that more efficient? Building the 200-million-row table this way would take a very long time. Is importing from CSV faster? I tried that with a small sample of my data, and it timed out after 5 minutes. Thanks
100 million selects in one second is very extreme for any SQL database. I imagine you'd need to have it on a server with a very powerful processor and keep the database on a solid-state disk. Seeing as this seems to be a very specific application, and the data won't change, have you considered loading the CSV file directly into a sorted array in RAM and writing your own search over the array?
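A minimal sketch of that idea in Java, based on the table layout described in the question (the class name `RankTable`, the flat-array storage, and the CSV column order `deckForm,rank0,...,rank5` are my own assumptions):

```java
import java.util.Arrays;

class RankTable {
    private final long[] keys;  // sorted deckForm values (10-digit numbers fit in a long)
    private final byte[] ranks; // 6 ranks per row, stored flat: ranks[6*i .. 6*i+5]

    private RankTable(long[] keys, byte[] ranks) {
        this.keys = keys;
        this.ranks = ranks;
    }

    /** Builds the table from CSV lines "deckForm,rank0,rank1,rank2,rank3,rank4,rank5". */
    static RankTable fromCsv(String[] lines) {
        int n = lines.length;
        long[][] rows = new long[n][7];
        for (int i = 0; i < n; i++) {
            String[] fields = lines[i].split(",");
            for (int j = 0; j < 7; j++) rows[i][j] = Long.parseLong(fields[j].trim());
        }
        Arrays.sort(rows, (a, b) -> Long.compare(a[0], b[0])); // sort rows by deckForm
        long[] keys = new long[n];
        byte[] ranks = new byte[6 * n];
        for (int i = 0; i < n; i++) {
            keys[i] = rows[i][0];
            for (int j = 0; j < 6; j++) ranks[6 * i + j] = (byte) rows[i][j + 1];
        }
        return new RankTable(keys, ranks);
    }

    /** Returns the six ranks for deckForm, or null if it is not in the table. */
    byte[] lookup(long deckForm) {
        int i = Arrays.binarySearch(keys, deckForm);
        if (i < 0) return null;
        // For maximum throughput you'd return the index and read in place
        // instead of copying, but a copy keeps the sketch simple.
        return Arrays.copyOfRange(ranks, 6 * i, 6 * i + 6);
    }

    public static void main(String[] args) {
        RankTable t = fromCsv(new String[] {
            "2000000001,1,2,3,4,5,6",
            "1000000000,7,8,9,10,11,12"
        });
        System.out.println(Arrays.toString(t.lookup(1000000000L))); // prints [7, 8, 9, 10, 11, 12]
    }
}
```

Memory-wise this layout costs about 14 bytes per row (8 for the key, 6 for the ranks), so 2.5 million rows is only around 35 MB, and even 200 million rows is roughly 3 GB — tight but conceivable on a 6 GB machine. Each lookup is then a binary search in RAM with no SQL parsing, network round-trip, or disk access at all.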