
tbostrup asked

Crazy Performance issues in SQL Server 2014

We are seeing some crazy performance results in SQL Server 2014 while testing an upgrade from SQL Server 2008 R2. First, our UAT server shows some performance issues we didn't see on our Dev server (comparable configurations, AFAIK). Second, one of those performance issues has to do with trigger performance.

A third-party tool performs a prepared SQL statement (UPDATE schm.MyTable SET SomeIntValue = @P1 WHERE MyPK = @P2) for a number of rows in a medium-small table (~500k rows). Performance is linear.

- On SQL Server 2008 R2, the statements ran acceptably fast; the job averaged around 7 minutes for a typical set (~375k rows).
- On SQL Server 2014, the same set takes 6 hours - a 50x performance degradation (down to about 15 rows/s).
- Disabling the trigger brings the time back to the original 2008 R2 range.
- The trigger includes a test to determine whether the body should be executed. The column being updated is excluded, i.e. the main "body" of the trigger is not executed for this UPDATE statement. When I allow the main body to execute (remove SomeOtherCol from the IN list - see below), performance actually IMPROVES about 5x over the SQL 2014 performance when it IS excluded (it goes up to about 80 rows/s).
- If I run individual parameterized SQL statements (using sp_executesql) in SQL Server 2014, I get the 80 rows/s performance.

The trigger is based on a common template used in several of our other databases going back to 2006 or 2007, so it is not a case of new code being bad (it may be old code going bad, of course... :-)).

Has anyone seen anything like this, or have any suggestions? We are on SQL Server 2014 SP1 (no CU). I am trying to see if we can update to SP2, maybe with CU1.

The trigger looks like this:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

-- Trigger records timestamp for insert and latest update
CREATE TRIGGER [schm].[tiu_MyTable] ON [schm].[MyTable]
FOR INSERT, UPDATE
AS
DECLARE @Now DATETIME

-- Exclude logging of updates that are the result of this trigger updating
-- the CreatedOn and UpdatedOn columns:
IF EXISTS (
    SELECT 1
    FROM INFORMATION_SCHEMA.COLUMNS Field
    WHERE TABLE_NAME = 'MyTable'
      AND TABLE_SCHEMA = 'schm'
      AND sys.fn_IsBitSetInBitmask(COLUMNS_UPDATED(),
              COLUMNPROPERTY(OBJECT_ID(TABLE_SCHEMA + '.' + TABLE_NAME), COLUMN_NAME, 'ColumnID')) <> 0
      AND COLUMN_NAME NOT IN ('CreatedOn', 'SomeOtherCol', 'UpdatedOn')
)
BEGIN
    SET @Now = GETDATE()

    -- Log UpdatedOn:
    UPDATE [schm].[MyTable]
    SET UpdatedOn = @Now
    FROM [schm].[MyTable]
    INNER JOIN Inserted AS I ON I.MyPK = [schm].[MyTable].MyPK
    INNER JOIN Deleted AS D ON D.MyPK = [schm].[MyTable].MyPK
    WHERE D.CreatedOn IS NOT NULL

    -- Log Insert (no matching entry in Deleted):
    UPDATE [schm].[MyTable]
    SET CreatedOn = @Now
    FROM [schm].[MyTable]
    INNER JOIN Inserted AS I ON I.MyPK = [schm].[MyTable].MyPK
    LEFT OUTER JOIN Deleted AS D ON D.MyPK = [schm].[MyTable].MyPK
    WHERE D.MyPK IS NULL
      AND I.CreatedOn IS NULL

    -- Log Delete - can't capture a hard delete without saving elsewhere
END
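For reference, the per-row call reproduced with sp_executesql looks roughly like this (parameter types and values are assumed; the tool's actual call may differ):

-- Illustrative only: the shape of the statement the tool sends once per row
EXEC sp_executesql
    N'UPDATE schm.MyTable SET SomeIntValue = @P1 WHERE MyPK = @P2',
    N'@P1 int, @P2 int',
    @P1 = 1,
    @P2 = 42;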
Tags: performance, update, trigger, sql server 2014
2 comments

Magnus Ahlkvist commented:
Based on your observations, it seems like the IF EXISTS is the "performance killer" here. Have you isolated that part of the trigger and tested it on its own? And did you check whether you get different execution plans for that statement in SQL Server 2014 vs SQL Server 2008 R2?
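One way to compare those plans (a sketch against the plan cache; the trigger name is taken from the question above) is to pull the per-statement plans for the trigger on both instances:

-- Cached statistics and plans for the statements inside the trigger
SELECT st.text, qs.execution_count, qs.total_worker_time, qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE st.objectid = OBJECT_ID('schm.tiu_MyTable');

Running that on the 2008 R2 and the 2014 instance should show whether the IF EXISTS statement is getting a different plan shape.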
Magnus Ahlkvist commented:
As a side note: if more than one row is updated, and columns other than SomeOtherCol1 are updated in only one of those rows, you'll still get the logging for all rows. Perhaps that's how you want it, or perhaps your application will never make that kind of change. But it's one of the most common mistakes in trigger logic: logically treating the trigger as a row-by-row operator, while in fact it's executed once for all rows affected by a DML operation. Look at this example:

MERGE schm.MyTable AS t
USING (SELECT * FROM schm.MyTable WHERE MyPK % 2 = 0) AS s
    ON t.MyPK = s.MyPK
WHEN MATCHED THEN
    UPDATE SET SomeOtherCol1 = s.SomeCol
WHEN NOT MATCHED BY SOURCE THEN
    UPDATE SET SomeCol = 'new value';

It will falsely set UpdatedOn even for the rows where only SomeOtherCol1 was updated.
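If the intent is to stamp only the rows whose data columns actually changed, one common per-row alternative to COLUMNS_UPDATED() is to compare the Inserted and Deleted values directly (a sketch only; SomeIntValue and SomeCol stand in for the table's real data columns, and @Now is the variable already declared in the trigger):

-- Only stamp rows where a non-audit column really changed (EXCEPT makes the
-- comparison NULL-safe)
UPDATE t
SET UpdatedOn = @Now
FROM [schm].[MyTable] AS t
INNER JOIN Inserted AS I ON I.MyPK = t.MyPK
INNER JOIN Deleted  AS D ON D.MyPK = t.MyPK
WHERE EXISTS (SELECT I.SomeIntValue, I.SomeCol
              EXCEPT
              SELECT D.SomeIntValue, D.SomeCol);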
ThomasRushton answered
No answers here, but a few thoughts...

1. Aaron Bertrand's blog post at http://sqlblog.com/blogs/aaron_bertrand/archive/2011/11/03/the-case-against-information-schema-views.aspx indicates that MS advises against using the INFORMATION_SCHEMA objects.
2. If a trigger is basically just a block of SQL, then it's possible that it's subject to the vagaries of parameter sniffing on compilation.
3. Check your statistics (see the sketch below).
4. Apply SPs and CUs as per MS's guidelines, as they often fix performance issues. From MS's website - https://support.microsoft.com/en-us/kb/3164674 (this is for CU1 for SQL 2016, but the boilerplate is the same and has been since the start of this year):

> Microsoft recommends ongoing, proactive installation of CUs as they become available:
> SQL Server CUs are certified to the same levels as Service Packs, and should be installed at the same level of confidence.
> Historical data shows that a significant number of support cases involve an issue that has already been addressed in a released CU.
> CUs may contain added value over and above hotfixes. This includes supportability, manageability, and reliability updates.

Good luck.
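For point 3, a minimal sketch for checking and refreshing the statistics on the table from the question:

-- When were the statistics on the table last updated?
SELECT s.name, STATS_DATE(s.object_id, s.stats_id) AS last_updated
FROM sys.stats AS s
WHERE s.object_id = OBJECT_ID('schm.MyTable');

-- Refresh them (FULLSCAN is the heavy-handed option; a sampled update may be enough)
UPDATE STATISTICS schm.MyTable WITH FULLSCAN;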
1 comment

Kev Riley commented:
All good suggestions. Might also be worth checking against the old cardinality estimator (CE) too? https://msdn.microsoft.com/en-us/library/dn600374(v=sql.120).aspx
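Two common ways to test the legacy CE on SQL Server 2014 (a sketch; [YourDatabase] and the literal values are placeholders):

-- Option A: put the whole database on the pre-2014 cardinality estimator
ALTER DATABASE [YourDatabase] SET COMPATIBILITY_LEVEL = 110;

-- Option B: keep compatibility level 120 and force the legacy CE for a single test query
SELECT SomeIntValue
FROM schm.MyTable
WHERE MyPK = 42
OPTION (QUERYTRACEON 9481);  -- trace flag 9481 = legacy cardinality estimator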
tbostrup answered
Thanks! ThomasRushton - switching the logic from INFORMATION_SCHEMA to the catalog views gave us back our performance - or close to it. An improvement of 35x. We tested after updating statistics and rebuilding indexes. The database compatibility level was set to 120 (SQL 2014). This particular update is called for individual rows (SET SomeOtherCol = 1 WHERE MyPK = value). We'll look at upgrading the version, but we have several other applications/systems on SQL Server 2014 SP1, and significant testing effort would be required to get everyone updated at this time. It will be on our horizon, though.

Updated logic:

SELECT 1
FROM sys.columns AS c
INNER JOIN sys.tables AS t ON t.object_id = c.object_id
INNER JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE t.name = 'MyTable'
  AND s.name = 'schm'
  AND sys.fn_IsBitSetInBitmask(COLUMNS_UPDATED(),
          COLUMNPROPERTY(OBJECT_ID(s.name + '.' + t.name), c.name, 'ColumnID')) <> 0
  AND c.name NOT IN ('CreatedOn', 'SomeOtherCol', 'UpdatedOn')
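If it helps, the same check can probably be trimmed further by filtering sys.columns on the table's object_id and using column_id directly, since COLUMNPROPERTY(..., 'ColumnID') returns that same value. Untested against this workload, so treat it as a sketch:

-- Same column-updated test, without the joins to sys.tables/sys.schemas
SELECT 1
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID('schm.MyTable')
  AND sys.fn_IsBitSetInBitmask(COLUMNS_UPDATED(), c.column_id) <> 0
  AND c.name NOT IN ('CreatedOn', 'SomeOtherCol', 'UpdatedOn');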
1 comment

tbostrup commented:
SP2 made no difference compared to SP1 (with the modified trigger above). So we still have approximately a 50% performance hit compared to SQL 2008 R2 (SQL 2014 SP1 was 50x worse -> switching from INFORMATION_SCHEMA to the catalog views regained about 35x -> SP2 made no further difference -> roughly 50% overall loss of performance).
