clayp56 asked:
What is the best data type to use for large text?

I have a table (work_order) that has a column (long_description) with a data type of text. We have found that the largest value stored in the column is 818 MB. When the application that accesses the data queries the table for the record containing the large text, the response time is extremely slow. We tried changing the data type to varchar(max) and turning ON the 'large value types out of row' option, and the response time is still slow. We need help determining how to store this large text and improve response time. Any suggestions would be greatly appreciated. Thanks!
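For reference, the 'large value types out of row' setting mentioned above is a table option set with sp_tableoption; a minimal sketch of turning it on, assuming the table lives in the dbo schema:

    -- Push varchar(max) values off the data page, leaving a 16-byte pointer in-row.
    EXEC sp_tableoption 'dbo.work_order', 'large value types out of row', 1;

    -- Note: enabling the option only affects values written afterwards;
    -- existing values move off-row the next time they are updated.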

clayp56 commented:
Thanks for the replies. How should we approach this given the suggestions? Should we try compression first, and would that be page or row compression? Also, what is the method for storing a pointer to data kept in external files? Thanks.
Pavel Pawlowski commented:
Compression will help with the speed of retrieving the data from the IO subsystem, but it will not help with the network transport layer. The bigger slowdown in your scenario is almost certainly the network.
TimothyAWiseman answered:
Varchar(max) is most certainly the best data type for storing large text in the newer versions of SQL Server. The first question I would ask is whether you truly need to store that much text in the database. For truly large amounts, you could store just a pointer to external files and then call up the files as needed, perhaps even from another server so it is not tying up the database. Aside from that, I would look at traditional optimization techniques. Is your code as efficient as possible? Tied into that, are you avoiding RBAR operations (in other words, are you using set-based code)? Do you have indexes that support your normal query patterns? And while I tend to look at hardware last, upgrading it sometimes makes sense, especially if it is notably out of date or if demands have grown substantially since the original provisioning.
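A minimal sketch of that pointer approach, with illustrative names (only work_order and long_description come from the question; everything else here is assumed):

    -- Store a reference to the document instead of the text itself.
    -- work_order_id and document_path are assumed names for this sketch.
    CREATE TABLE dbo.work_order_document
    (
        work_order_id int           NOT NULL PRIMARY KEY,
        document_path nvarchar(260) NOT NULL  -- UNC path on a file server
    );

The application then reads the file from the file server, and SQL Server never has to ship the 818 MB payload at all.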

Cyborg answered:
Adding to @TimothyAWiseman's answer, you can consider data compression; it helps both performance and storage, but it is an Enterprise Edition feature, available in SQL Server 2008 and above.
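A minimal sketch of enabling page compression on the table from the question, with one caveat worth knowing: ROW and PAGE compression do not apply to LOB data stored off-row, so the benefit for an 818 MB value may be limited.

    -- Rebuild the table with page compression (Enterprise Edition, SQL Server 2008+).
    -- In-row data is compressed; off-row LOB pages are not.
    ALTER TABLE dbo.work_order REBUILD WITH (DATA_COMPRESSION = PAGE);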

Pavel Pawlowski answered:
If you query the table and do not need the large value immediately (e.g. when presenting a list on the client side), then query the rows without the big column, and fetch the big value separately only when it is needed. The issue with response time is that once the big column is in your query and the client fetches the row containing that data, all of it has to be read by the client, which takes a long time. Another possibility is to fetch only the first few characters for a preview, e.g. LEFT(long_description, 80), and to query the full value separately at the moment it is needed.
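A sketch of that two-step pattern against the table from the question (work_order_id is an assumed key name):

    -- 1) List query: everything except the big column, plus a short preview.
    SELECT work_order_id,
           LEFT(long_description, 80) AS description_preview
    FROM   dbo.work_order;

    -- 2) Detail query: the full text, only for the single row the user opens.
    DECLARE @work_order_id int = 1;  -- supplied by the application
    SELECT long_description
    FROM   dbo.work_order
    WHERE  work_order_id = @work_order_id;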

Pavel Pawlowski commented:
Only to add: if you have 818 MB of data in a column, then even on a 1 Gbps connection to the server the maximum theoretical throughput is 128 MB/s, so 818 / 128 ≈ 6.4 seconds just to retrieve the data for that one column. And that is the theoretical maximum, which you will never actually reach on a 1 Gbps connection.
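The arithmetic, as a quick check (assuming an ideal 1 Gbps link and ignoring protocol overhead):

    -- 1 Gbps / 8 bits per byte ≈ 128 MB/s (using 1024-based Gb, per the comment above).
    SELECT 818.0 / 128.0 AS seconds_to_transfer;  -- ~6.39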
Fatherjack answered:
Others have touched on it, but just to make it clear: changing the data type won't affect the speed of the query. You are asking for data, and that amount of data has to be shipped from the disk on the server to the client making the request. Whether it is VARCHAR(8000), NVARCHAR(MAX), or whatever, 818 MB is still 818 MB. Compression will speed up getting the data off the disk, but that only helps disk access; the uncompressed data still has to travel over your network and obey the physical limitations that apply there. To speed up the query, request less data. Big columns in your database are also making your backups slower and taking up space on your tape. If the values are actual files, then shifting that content to a file server and simply using the database to store the metadata of where each file lives will improve things in many ways.
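As a starting point for that clean-up, a sketch for finding which rows actually carry the huge values (work_order_id is an assumed key name; only work_order and long_description come from the question):

    -- Rank rows by the size of the stored text, largest first.
    SELECT TOP (10)
           work_order_id,
           DATALENGTH(long_description) / 1048576.0 AS size_in_MB
    FROM   dbo.work_order
    ORDER  BY DATALENGTH(long_description) DESC;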
