Question

Cathy asked:

How may I speed up this query? It works as it is, but it is slow.

--previous 3 months data
DECLARE @dStart datetime, @dEnd datetime;

SET @dEnd   = DATEADD(month, DATEDIFF(month, -1, GETDATE()) - 1, -1);
SET @dStart = DATEADD(month, DATEDIFF(month, 0, GETDATE()) - 3, 0);

WITH Call_CTE ([ID], [Customer], [month], [Device], [CallNo], [Date], [Email], [Phone], [Recived], [Priority], [CallType], [CallCategory], [Desc], [RespInSLA], [Fixed in SLA], [FixedFirstTime], [Row], [Lastjournal], [journaltype])
AS
(
    SELECT
        c2.CustID AS [ID],
        c2.Company AS [Customer],
        DATENAME(month, c1.closeddate) AS [month],
        c2.product AS [Device],
        c2.CallID AS [CallNo],
        c1.closeddate AS [Date],
        c1.TrackerEMail AS [Email],
        c2.PHONE1 AS [Phone],
        c1.recvdDate AS [Recived],
        CASE WHEN c1.CallType = 'Monitoring Alert' THEN 'Alert' ELSE c1.Priority END AS [Priority],
        c1.CallType AS [CallType],
        c1.Category AS [CallCategory],
        c1.CallDesc AS [Desc],
        CASE WHEN [SLAResponseMins] <= [RespMins] THEN 1 ELSE 0 END AS [RespInSLA],
        CASE WHEN [SLAFixMins] <= [FixMins] THEN 1 ELSE 0 END AS [Fixed in SLA],
        CASE WHEN j.JournalType = 'Change - Success' THEN 1 ELSE 0 END AS [FixedFirstTime],
        ROW_NUMBER() OVER (PARTITION BY c1.CallID ORDER BY c1.CallID) AS [Row],
        CASE WHEN (SELECT TOP 1 j.JournalType
                   FROM dbo.Journal j
                   INNER JOIN dbo.CallLog c2 ON j.CallID = c2.CallID
                   WHERE c1.CallID = c2.CallID
                   ORDER BY j.CallID, j.EntryDate + ' ' + j.EntryTime DESC) IN ('Internal', 'Third Party')
             THEN 'Internal log'
             ELSE (SELECT TOP 1 j.EntryText
                   FROM dbo.Journal j
                   INNER JOIN dbo.CallLog c2 ON j.CallID = c2.CallID
                   WHERE c1.CallID = c2.CallID
                   ORDER BY j.CallID, j.EntryDate + ' ' + j.EntryTime DESC)
        END AS [Lastjournal],
        (SELECT TOP 1 j.JournalType
         FROM dbo.Journal j
         INNER JOIN dbo.CallLog c2 ON j.CallID = c2.CallID
         WHERE c1.CallID = c2.CallID
         ORDER BY j.CallID, j.EntryDate + ' ' + j.EntryTime DESC) AS [journaltype]
    FROM dbo.CallLog c1
    INNER JOIN dbo.Journal j ON c1.CallID = j.CallID
    INNER JOIN dbo.subset c2 ON c2.CallID = c1.CallID
    WHERE c1.closedDate BETWEEN @dStart AND @dEnd
      AND c2.CustID IS NOT NULL
      AND c1.recvdDate IS NOT NULL
)
SELECT *
FROM Call_CTE
WHERE [Row] = 1;
3 comments

Fatherjack ♦♦ commented:
What makes you think it is slow? How long does it take to execute? How much faster do you want it to go? 10% faster? 100% faster? 1000% faster? How much do you want to spend on achieving this improvement? These are serious questions; we don't necessarily need all the answers here, but you should consider them when thinking about tuning a query. Can you post the query plan, or the results of the Messages tab when you run the query with SET STATISTICS IO ON, please?
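For reference, a minimal way to produce what Fatherjack is asking for, assuming the query is run from a Management Studio query window (the Messages tab then shows per-table logical reads):

SET STATISTICS IO ON;
SET STATISTICS TIME ON;   -- optional extra: also reports CPU and elapsed time

-- ... run the query here, then copy the output from the Messages tab ...

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;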
Grant Fritchey ♦♦ commented:
The site runs off of voting. For all answers that are helpful, please indicate this by clicking on the thumbs up next to those answers. If any of the answers lead to a solution, please indicate that by clicking on the check box next to the answer.
Cathy commented:
After looking at the query plan, I am not sure I can really speed it up further. I just thought there might be a better way of writing it. Thank you for your help.
Kev Riley answered:
From where I'm sitting this is all guesswork... Is CallID the primary key for CallLog? If so, the line

ROW_NUMBER() OVER (PARTITION BY c1.CallID ORDER BY c1.CallID) AS [Row]

is going to return 1 for each and every row, meaning the final

SELECT * FROM Call_CTE WHERE [Row] = 1;

will return all rows. Running a windowed function to no purpose is a waste.

The subqueries that ORDER BY j.CallID, j.EntryDate + ' ' + j.EntryTime DESC are going to consume a lot of processing, and they make me think that j.EntryDate and j.EntryTime are not the right datatypes.
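A sketch of one way to act on this (untested against the real schema; it reuses the column names from the question and assumes EntryDate + ' ' + EntryTime sorts chronologically as a string): fetch the latest journal entry once per call with OUTER APPLY, rather than running three correlated TOP 1 subqueries that each sort dbo.Journal.

SELECT
    c1.CallID,
    lj.JournalType AS [journaltype],
    CASE WHEN lj.JournalType IN ('Internal', 'Third Party')
         THEN 'Internal log'
         ELSE lj.EntryText
    END AS [Lastjournal]
FROM dbo.CallLog c1
OUTER APPLY
(
    -- latest journal entry for this call, evaluated once per CallLog row
    SELECT TOP 1 j.JournalType, j.EntryText
    FROM dbo.Journal j
    WHERE j.CallID = c1.CallID
    ORDER BY j.EntryDate + ' ' + j.EntryTime DESC
) AS lj;

The join back to dbo.CallLog inside each of the original subqueries adds nothing once the correlation is on CallID directly, and the expensive sort now runs once per call instead of three times.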

Cathy answered:
Row number is there because the query brings up duplicates, and I can't use DISTINCT as there are text fields in there that are not comparable. I had to join the date and time to get the last journal. I will try changing the datatypes to datetime using CAST and see if that helps.

As far as the execution time is concerned, it does not look too bad:

Client Execution Time                                    10:25:08   10:24:53      Average

Query Profile Statistics
  Number of INSERT, DELETE and UPDATE statements                0          0       0.0000
  Rows affected by INSERT, DELETE, or UPDATE statements         0          0       0.0000
  Number of SELECT statements                                   2          0       1.0000
  Rows returned by SELECT statements                           11          0       5.5000
  Number of transactions                                        0          0       0.0000

Network Statistics
  Number of server roundtrips                                   9          3       6.0000
  TDS packets sent from client                                 10          4       7.0000
  TDS packets received from server                             59          3      31.0000
  Bytes sent from client                                     5044       4490    4767.0000
  Bytes received from server                               184675        518   92596.5000

Time Statistics
  Client processing time                                        1          1       1.0000
  Total execution time                                        175         20      97.5000
  Wait time on server replies                                 174         19      96.5000
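For what it's worth, a sketch of the CAST Cathy mentions (assuming EntryDate and EntryTime are character columns whose concatenation is in a format datetime accepts; @CallID is a hypothetical parameter for illustration only):

DECLARE @CallID int = 1;  -- hypothetical call id, type assumed

SELECT TOP 1 j.JournalType, j.EntryText
FROM dbo.Journal j
WHERE j.CallID = @CallID
ORDER BY CAST(j.EntryDate + ' ' + j.EntryTime AS datetime) DESC;

Note that ordering by any expression built from the two columns still forces a sort and cannot use an index on them, so a cast on its own may not change the timings.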
3 comments

Cathy commented:
Attached the execution plan: last journal.zip (11.6 KiB) at /storage/temp/1170-last+journal.zip
Cathy commented:
I used cast and it made no difference to the execution time.
Kev Riley ♦♦ commented:
Looking at that execution plan, 40% of the time is on the sorts in the subqueries, as I suspected. If you can't remove them, then instead of using ROW_NUMBER to remove the dupes, maybe rewrite the bit that is causing the dupes in the first place; I presume it's the join to dbo.subset?
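A sketch of that suggestion, assuming the duplicates come from dbo.subset holding more than one row per CallID and that the columns the query needs are the same on every duplicate (both assumptions that only Cathy's data can confirm):

-- one row per CallID from dbo.subset; column list taken from the query
SELECT CallID,
       MAX(CustID)  AS CustID,
       MAX(Company) AS Company,
       MAX(product) AS product,
       MAX(PHONE1)  AS PHONE1
FROM dbo.subset
GROUP BY CallID;

Joining this derived table as c2 in place of dbo.subset returns at most one row per call, which would make the ROW_NUMBER column and the final WHERE [Row] = 1 filter unnecessary, provided dbo.subset really is the only source of duplicates (the INNER JOIN to dbo.Journal in the outer query is another candidate worth checking).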
