question

Ullas avatar image
Ullas asked

Large number of records in a table (with one varchar(max) column) causes deadlock with Windows service

Hi, I have a Windows service that sends emails by reading the email contents from a table, including the body (a varchar(max) column holding about 40,000 bytes of data). The service updates the status of each record after it is sent. The issue only appears when the table holds more than roughly 7,500 records: below that threshold the service runs fine and all updates to the table succeed, but beyond it a deadlock occurs on the table and my service breaks. Please help with your suggestions...
Tags: sql-server-2008, deadlock, windows-services

Kev Riley avatar image
Kev Riley answered
Sounds like your read (select) of the data and the subsequent update are in the same transaction, and that given enough load (over 7,500 rows) a deadlock occurs, probably on page resources. Try to split out the two functions of reading and updating.
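A minimal sketch of that separation, using Python with the stdlib sqlite3 module as a stand-in for the real SQL Server tables (the table and column names here are hypothetical, not from the original post):

```python
import sqlite3

# In-memory stand-in for the email queue table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE EmailQueue (RecordID INTEGER PRIMARY KEY, Body TEXT, Status TEXT)")
conn.executemany("INSERT INTO EmailQueue VALUES (?, ?, 'PENDING')",
                 [(i, f"body {i}") for i in range(1, 6)])
conn.commit()

# Step 1: read everything in one short transaction, then let go of it.
rows = conn.execute(
    "SELECT RecordID, Body FROM EmailQueue WHERE Status = 'PENDING'").fetchall()
conn.commit()  # no locks are held while the emails are being sent

# Step 2: send the emails outside any transaction.
sent_ids = []
for record_id, body in rows:
    # send_email(body) would go here
    sent_ids.append(record_id)

# Step 3: update the statuses in a second, separate transaction.
conn.executemany("UPDATE EmailQueue SET Status = 'SENT' WHERE RecordID = ?",
                 [(i,) for i in sent_ids])
conn.commit()
```

The point is purely structural: the read and the write never overlap inside one open transaction, so neither can hold shared locks that block the other's exclusive locks.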

Grant Fritchey avatar image
Grant Fritchey answered
Deadlocks usually occur because of a difference in the way data is accessed. You have more than one process involved (usually, unless we're talking parallelism, which is possible), and they each want to exclusively lock a resource that each of them already has a shared lock on. The classic case is for one process to read rows 1 and 2, take a lock on 1, and want a lock on 2, while the other process in the meantime reads rows 2 and 1, takes a lock on 2, and wants a lock on 1. This is not a simple blocking situation, since neither can clear until the other does. This is the classic deadlock, sometimes referred to as a deadly embrace.

If you don't already have it enabled, turn on trace flag 1222. This will capture the deadlock information and store it in your error log. Then you can figure out which processes are involved in the deadlock and where it's occurring. That will enable you to take Kev's advice and split up the code or rearrange it as needed.
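The deadly embrace described above, and the standard cure of acquiring locks in one consistent order, can be illustrated outside SQL Server with two ordinary threads (a sketch of the general locking principle, not of SQL Server internals):

```python
import threading

# Two locks standing in for two row/page resources.
lock_a, lock_b = threading.Lock(), threading.Lock()
results = []

def worker(name):
    # Both workers acquire the locks in the SAME order: a, then b.
    # If one worker took lock_b first, each could end up holding one
    # lock while waiting forever for the other -- the deadly embrace.
    with lock_a:          # analogous to locking row 1
        with lock_b:      # then row 2 -- identical order in every process
            results.append(name)

t1 = threading.Thread(target=worker, args=("process_1",))
t2 = threading.Thread(target=worker, args=("process_2",))
t1.start(); t2.start()
t1.join(); t2.join()
# Both threads complete because neither can hold b while waiting for a.
```

In T-SQL terms, the equivalent habit is having every procedure touch tables and rows in the same order, so two sessions can block each other briefly but never cyclically.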

Blackhawk-17 avatar image
Blackhawk-17 answered
Other things to consider are whether your service can launch multiple runs that conflict with each other and whether you are using a cursor in your processing. @Grant Fritchey described deadlocks nicely. Are you seeing these reciprocal blocks or just normal blocking?
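One common way to stop a service from launching overlapping runs is a mutual-exclusion guard taken at startup. Inside SQL Server you might use `sp_getapplock` for this; as an OS-level sketch (the lock-file path and function names below are hypothetical), the idea looks like:

```python
import os
import tempfile

# Hypothetical lock-file path; a real service would pick a stable location.
LOCK_PATH = os.path.join(tempfile.gettempdir(), "email_service.lock")

def try_acquire_run_lock():
    """Return a file descriptor if this is the only running instance, else None."""
    try:
        # O_EXCL makes creation atomic: a second instance fails to create
        # the file instead of starting an overlapping run.
        return os.open(LOCK_PATH, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return None

def release_run_lock(fd):
    os.close(fd)
    os.remove(LOCK_PATH)
```

One caveat with file-based locks is that a crashed run can leave a stale lock file behind, which is one reason an in-database lock such as `sp_getapplock` (released automatically when the session ends) is often preferred for SQL Server workloads.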

Oleg avatar image
Oleg answered
Here is one way to possibly avoid the problem altogether. You can try changing the code in the Windows service slightly: when the data is originally retrieved from the data store and then read and processed record by record, simply add a value to a collection of processed items (something like RecordID and Status) instead of updating the record at that time. Once the email-sending loop finishes, you can feed the collection to a stored procedure that updates all processed records in a single call. This is possible because you have SQL Server 2008. The main difference is that instead of 7,501 calls (one to read the data plus 7,500 updates), you use only 2 calls: one to read and one to update all records at once.

If you are using a data reader to read your data, then this could be a direct cause of your trouble, simply because readers keep the connection open until you are done with them. This means you have an open connection used both to read data (set based) and to update it (in RBAR fashion).

You can read about how to pass a bunch of records as a single parameter of a SqlDbType.Structured type to the procedure in my [answer to this question][1]; I have a complete sample there. I recently gave a presentation on 2 different methods of achieving the same result, one using a table-valued parameter and the other simply using XML. Both scale very nicely. We use the XML method where I work because it is very simple and available in SQL Server 2005 (we still have a mixed 2005/2008 environment), so I know first hand that it scales well, even when the XML parameter runs well into hundreds of MBs. Despite the size, it is so much better to issue 1 simple update of a bunch of records from the join of the XML and the destination table than to torture the latter by updating one record at a time.

I have not pushed the table-valued type to its limit yet, but I know that it scales well into tons of records and even beats the XML performance by far. Please let me know if the info in the answer linked above is not enough, and I can add the guts of my presentation (which includes a complete sample consisting of a C# console app and T-SQL to generate sample data and objects) to this answer. Oleg [1]: http://ask.sqlservercentral.com/questions/7307/how-to-update-the-session-datatable-from-cnet-to-sqlserver-2008
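The shape of that single set-based update can be sketched with Python's stdlib sqlite3 standing in for SQL Server (a real implementation would pass a table-valued parameter or XML to a stored procedure, as in the linked answer; the table and column names here are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE EmailQueue (RecordID INTEGER PRIMARY KEY, Status TEXT)")
conn.executemany("INSERT INTO EmailQueue VALUES (?, 'PENDING')",
                 [(i,) for i in range(1, 7501)])

# While sending, collect (RecordID, Status) pairs in memory instead of
# issuing one UPDATE per email.
processed = [(i, "SENT") for i in range(1, 7501)]

# One call at the end: load the collection into a staging table (the
# TVP/XML analogue) and update the destination with a single set-based join.
conn.execute("CREATE TEMP TABLE Processed (RecordID INTEGER PRIMARY KEY, Status TEXT)")
conn.executemany("INSERT INTO Processed VALUES (?, ?)", processed)
conn.execute("""
    UPDATE EmailQueue
       SET Status = (SELECT p.Status FROM Processed p
                      WHERE p.RecordID = EmailQueue.RecordID)
     WHERE RecordID IN (SELECT RecordID FROM Processed)
""")
conn.commit()
```

The destination table sees one short write transaction instead of 7,500 separate ones, which is exactly why this pattern sidesteps the deadlock in the question.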