Hi Experts, I have a database which is growing rapidly and will reach 100GB shortly. I would like to know whether SQL Server 2008 can handle databases larger than 100GB. What are the best practices for handling such big databases, especially when my indexes are not well maintained? Thanks in advance!
I had a SQL Server 7.0 system with over 700GB of data that was collecting a GB of data a day, and that was 15 years ago. Yes, SQL Server 2008 can certainly handle 100GB of data and lots more. You just have to maintain your statistics properly, make sure you have enough space to get good backups in place, and, as @Dave_Green says, take into account that it's going to need more memory, CPU and disk. People are running multi-terabyte systems on SQL Server. The largest in the world is well over 100TB and working fine according to the people running it. You just have to do the necessary work to ensure its performance and maintenance.
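To make the statistics and index maintenance above concrete, here's a minimal T-SQL sketch for SQL Server 2008. The table name `dbo.YourLargeTable` is a placeholder, and the 10%/30% fragmentation thresholds are just the common rule of thumb, not hard limits:

```sql
-- Refresh out-of-date statistics across the current database (sampled scan)
EXEC sp_updatestats;

-- Find fragmented indexes in the current database
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 10;

-- Rule of thumb: REORGANIZE between ~10-30% fragmentation, REBUILD above ~30%
ALTER INDEX ALL ON dbo.YourLargeTable REORGANIZE;  -- placeholder table name
```

In practice you'd wrap this in a scheduled Agent job rather than running it ad hoc, and rebuild rather than reorganize the indexes that are heavily fragmented.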
On a purely limitations basis, [SQL Server 2008's capacity limitations are specified here]. However, there are a number of other considerations, such as memory and CPU (and the way in which the database is used - the % of records that are queried, written to, etc.) that will impact how well a server runs as its databases grow in size.
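Since growth is the concern here, it's worth tracking how much of each data file is actually in use. A quick sketch using built-in catalog views (sizes are reported in 8KB pages, hence the division by 128 to get MB):

```sql
-- Per-file allocated vs. used space for the current database
SELECT name,
       size / 128.0 AS size_mb,
       size / 128.0
         - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int) / 128.0 AS free_mb
FROM sys.database_files;
```

Watching this over time tells you when to pre-grow files deliberately instead of relying on autogrow events, which can stall a busy system.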