Question

bimini07 asked

MIN and MAX with values with a twist

Hi, I need to get the MIN and MAX dates for volume, but I need to group by consecutive runs of volume, not by all volume of the same amount. Basically, I have daily volumes and the dates for those daily volumes. I need to get the MIN date as "from" and the MAX date as "to" for each set of volume. Note that a volume can span several dates, then break, and then have a new set of dates for the same volume. Hopefully the screenshots below do a better job of explaining than I can. I know how to do this via code, but I was wondering whether the same is possible with SQL. Please note that the SQL will be called from within an application, so I can't insert into a temp table to get my end result data set.

Here is the raw data from the table that I am querying:

![alt text][1]

Here is what I ultimately want:

![alt text][2]

And here is what I am getting with my current SQL query:

[code="sql"]
SELECT daily_volume,
       MIN(volume_date) AS min_date,
       MAX(volume_date) AS max_date,
       ins_num
FROM   daily_volume
WHERE  ins_num = 3854439
GROUP  BY daily_volume, ins_num
[/code]

Result from my SQL query:
daily_volume  min_date    max_date    ins_num
202           11/30/2015  11/30/2015  3854439
900           11/28/2015  11/28/2015  3854439
1100          11/24/2015  12/3/2015   3854439
1200          11/27/2015  11/27/2015  3854439
1300          11/26/2015  11/26/2015  3854439

SQL code to replicate the table:

[code="sql"]
-- Drop the table if it exists
IF OBJECT_ID('TempDB..#mytable', 'U') IS NOT NULL
    DROP TABLE #mytable
GO

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

-- Create the table
CREATE TABLE #mytable
(
    [ins_num]               [int]      NULL,
    [param_seq_num]         [tinyint]  NULL,
    [volume_date]           [datetime] NULL,
    [daily_volume]          [float]    NULL,
    [monthly_priced_volume] [float]    NULL
) ON [PRIMARY]
GO

-- Insert data into the table
INSERT INTO #mytable (ins_num, param_seq_num, volume_date, daily_volume, monthly_priced_volume)
VALUES ('3854439', '1', 'Nov 24 2015 12:00AM', '1100', '0'),
       ('3854439', '1', 'Nov 25 2015 12:00AM', '1100', '0'),
       ('3854439', '1', 'Nov 26 2015 12:00AM', '1300', '0'),
       ('3854439', '1', 'Nov 27 2015 12:00AM', '1200', '0'),
       ('3854439', '1', 'Nov 28 2015 12:00AM', '900', '0'),
       ('3854439', '1', 'Nov 29 2015 12:00AM', '1100', '0'),
       ('3854439', '1', 'Nov 30 2015 12:00AM', '202', '0'),
       ('3854439', '1', 'Dec 1 2015 12:00AM', '1100', '0'),
       ('3854439', '1', 'Dec 2 2015 12:00AM', '1100', '0'),
       ('3854439', '1', 'Dec 3 2015 12:00AM', '1100', '0')
[/code]

[1]: /storage/temp/3056-rawdata.jpg
[2]: /storage/temp/3057-endresult.jpg
Tags: sql-server-2008, query, sql-server
rawdata.jpg (71.3 KiB)
endresult.jpg (35.6 KiB)


1 Answer

Kev Riley answered
This is a variation of the 'Gaps and Islands' problem. Once you've worked out a way of grouping the consecutive volumes, the rest is easy :) I chose to use the difference between a row_number() over the volume_dates partitioned by the volumes and a row_number() over the entire set of volume_dates.

[code="sql"]
SELECT MIN(daily_volume) AS qty,
       MIN(volume_date)  AS [from],
       MAX(volume_date)  AS [to]
FROM   (SELECT volume_date,
               daily_volume,
               daily_volume - (ROW_NUMBER() OVER (PARTITION BY daily_volume ORDER BY volume_date)
                             - ROW_NUMBER() OVER (ORDER BY volume_date)) AS GroupID
        FROM   #mytable) grouping_subquery
GROUP  BY GroupID
ORDER  BY MIN(volume_date)
[/code]
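The query above does the grouping declaratively; since the question mentions knowing how to do this via code, here is a minimal Python sketch (an illustration of the same islands idea, not the T-SQL itself) using the sample data from the question — adjacent rows with equal volume form one island:

```python
from itertools import groupby
from datetime import date

# Sample (volume_date, daily_volume) rows from the question, already in date order.
rows = [
    (date(2015, 11, 24), 1100), (date(2015, 11, 25), 1100),
    (date(2015, 11, 26), 1300), (date(2015, 11, 27), 1200),
    (date(2015, 11, 28), 900),  (date(2015, 11, 29), 1100),
    (date(2015, 11, 30), 202),  (date(2015, 12, 1), 1100),
    (date(2015, 12, 2), 1100),  (date(2015, 12, 3), 1100),
]

# An island is a maximal run of consecutive rows with the same volume,
# so grouping adjacent equal volumes reproduces the query's output.
result = []
for volume, run in groupby(rows, key=lambda r: r[1]):
    run = list(run)
    result.append((volume, run[0][0], run[-1][0]))  # (qty, from, to)

for qty, d_from, d_to in result:
    print(qty, d_from, d_to)
```

Note that the 1100 volume correctly appears twice: once for 11/24–11/25 and once for 12/1–12/3, with the lone 11/29 row as its own island in between.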
3 comments


Kev, you are a genius! This worked flawlessly. I can't even come close to understanding what you did above, but I'm going to start reading up on the Gaps and Islands problem to make some sense of it. Thanks!
1 Like
Itzik Ben-Gan is my go-to resource for this - you can get his chapter on it from SQL Server MVP Deep Dives, free from https://www.manning.com/books/sql-server-mvp-deep-dives
0 Likes
And to understand better how it works, break out those columns from the subquery and look at the values.
0 Likes
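Following that advice without a database handy, the intermediate columns can be simulated in any language. A minimal Python sketch (an illustration only; the names rn_part, rn_all, and group_id are just labels for the two ROW_NUMBER() values and the GroupID expression from the answer's subquery):

```python
from datetime import date

# Sample (volume_date, daily_volume) rows from the question, in date order.
rows = [
    (date(2015, 11, 24), 1100), (date(2015, 11, 25), 1100),
    (date(2015, 11, 26), 1300), (date(2015, 11, 27), 1200),
    (date(2015, 11, 28), 900),  (date(2015, 11, 29), 1100),
    (date(2015, 11, 30), 202),  (date(2015, 12, 1), 1100),
    (date(2015, 12, 2), 1100),  (date(2015, 12, 3), 1100),
]

counts = {}  # running count per volume = ROW_NUMBER() OVER (PARTITION BY daily_volume ORDER BY volume_date)
detail = []
for rn_all, (d, vol) in enumerate(rows, start=1):  # rn_all = ROW_NUMBER() OVER (ORDER BY volume_date)
    counts[vol] = counts.get(vol, 0) + 1
    rn_part = counts[vol]
    group_id = vol - (rn_part - rn_all)  # the GroupID expression from the subquery
    detail.append((d, vol, rn_part, rn_all, group_id))
    print(d, vol, rn_part, rn_all, group_id)
```

Printed out this way, you can see why it works: within a run of consecutive equal volumes, rn_part and rn_all both advance by 1, so their difference is constant and the run shares one group_id (Dec 1–3 all get 1104), while the isolated 1100 on 11/29 gets a different one (1103).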
