Once again it is time for a Phil Factor Speed Phreak Challenge and the prize is now a $100 Amazon voucher, and the privilege of displaying this rather nice trophy jpg on your blog / website / bedroom wall.
This time your task is simply to produce a summary report of the cost of international phone calls made within an organization, week by week, for both Users and Offices. You may use any method you see fit, and you may also add any indexes, table functions or views that you wish (though not an indexed view). Creation of these will not count towards the overall execution time. If you are unsure whether what you want to do might disqualify you, then please post a comment.
The table CallLog contains the log of every phone call, including which user called which number, when the call started and ended, and the office the user was in at the time. You will notice that it is not well normalized, since it is actually a view taken from several tables. Users are never in a fixed office, and can move from office to office at any point. Calls with a CallEnd equal to CallStart were not answered and can be safely ignored.
To calculate the cost of the call you need to use a lookup within the PhoneTariff table. The calls are prefixed by an area code corresponding to a country.
Note that many of the call areas share the same starting character sequence, so calls made to numbers starting '35191' must be priced using the tariff for '35191', not '351'.
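To make the longest-prefix rule concrete, here is a minimal sketch of the lookup logic in Python (the actual entries are T-SQL against the PhoneTariff table; the prefixes and rates below are made up for illustration):

```python
# Illustrative tariff prefixes -> rate; real data lives in PhoneTariff.
tariffs = {"351": 0.10, "35191": 0.25, "44": 0.05}

def match_tariff(number: str) -> str:
    """Return the longest tariff prefix that the dialled number starts with."""
    # Try the longest candidate prefixes first, so '35191...' is not
    # swallowed by the shorter '351' code.
    for prefix in sorted(tariffs, key=len, reverse=True):
        if number.startswith(prefix):
            return prefix
    raise ValueError(f"no tariff prefix matches {number}")

print(match_tariff("351912345678"))  # -> '35191', not '351'
```

In SQL terms this is the same idea as joining on `CHARINDEX`/`LIKE` ordered by prefix length descending and taking the first hit.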
There is a rather elaborate charging system based on the length of the call: the per-minute rate changes as the call gets longer, so you need to cost each band of minutes separately and sum the results.
Looking at the PhoneTariffCharges table:
The first 8 minutes of the call are charged at 0.4792 per minute, minutes 9 to 31 at 0.18 per minute, minutes 32 to 59 at 0.5702 per minute, and so on. All tariffs have a final band with an UpToXMinutes of 9999, so you don't need to worry about an upper limit. The call length is rounded up to the nearest whole minute.
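As a sanity check for your entry, the banded charging can be sketched like this (in Python rather than T-SQL, with the band edges and rates taken from the example above; the final band's rate is an assumption):

```python
import math

# Each band is (UpToXMinutes, rate per minute); the last band's 9999
# means "no upper limit". Real values come from PhoneTariffCharges.
bands = [(8, 0.4792), (31, 0.18), (59, 0.5702), (9999, 0.30)]  # last rate assumed

def call_cost(seconds: int) -> float:
    """Round the call length up to whole minutes, then charge each
    minute at the rate of the band it falls into, and sum."""
    minutes = math.ceil(seconds / 60)
    cost, prev_upper = 0.0, 0
    for upper, rate in bands:
        billable = max(0, min(minutes, upper) - prev_upper)
        cost += billable * rate
        prev_upper = upper
        if minutes <= upper:
            break
    return cost

# A 10-minute call: 8 minutes at 0.4792 plus 2 minutes at 0.18.
print(round(call_cost(10 * 60), 4))  # -> 4.1936
```

The fast T-SQL entries do the equivalent set-wise, costing all calls one charge band at a time rather than looping per call.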
Here is the DDL to create the tables:
and here's the link to the data. Use
to load the data in.
Here's the solution provided by our mediocre developer Robert Bar; please note the fix for an issue where the week number was being taken from @CallEnd.
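The point of that fix is that a call belongs to the week in which it started, not the week in which it ended: a call straddling midnight on the last day of a week must land in the earlier week. A minimal sketch of the rule, using Python's ISO week numbers purely for illustration (the T-SQL entries use DATEPART, whose week numbering differs):

```python
from datetime import datetime

def report_week(call_start: datetime, call_end: datetime) -> int:
    # The week bucket is taken from when the call started,
    # not when it ended.
    return call_start.isocalendar()[1]

# A call straddling midnight Sunday 13 Dec -> Monday 14 Dec 2009
# falls in the week of the 13th, even though it ended in the next week.
start = datetime(2009, 12, 13, 23, 55)
end = datetime(2009, 12, 14, 0, 10)
print(report_week(start, end))  # -> 50 (the end date is in week 51)
```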
Here are some guidelines for your entries:
1) Include a header in your suggestion. Make sure your name and the current date are present.
2) Include an edition number. First edition is 1. If you later improve your current suggestion post it again as version 2. Example: “Peso 1” and if improved, “Peso 1b”, “Peso 1c” etc.
3) If you are trying a new algorithm, change the edition to “Peso 2”. If you improve this algorithm, change the version to “Peso 2b”, “Peso 2c” etc. This will save Phil hours of work in the test harness!
4) The solution must clear up all its mess (temporary tables, indexes, etc.) so it can be re-run without errors.
As ever Phil Factor will be final judge.
The closing date will be midnight, Thursday 17th December, London time.
OK, here's my first go (now revised to align with the question change - week taken from the start date):
This runs in between 10 and 15 seconds on my box, as compared to 35 minutes 25 secs for the Robert Bar (bless him) solution.
If stored procs aren't allowed, let me know and I'll change it to bare definition, it doesn't make a massive difference to the timings anyway...
Initial timings are pretty close:
Just for comparison - timings from Matt W's machine:
Note that matt 1 and lmu92 1 didn't run because I changed the collation in my test db and they both failed with collation conflicts. I didn't feel the need to alter my SQL entry! :)
Here's my second try (lmu92 1b, 20091216). Based on my previous version with the following changes:
a) I don't create the result tables any more, just returning the result sets with the SELECT statement (this seems common practice throughout the solutions provided so far)
b) replaced a UNION with a faster solution
c) added NOLOCK hints
d) building the intermediate table one step later, saving one update
prepare base tables (create index)
On my machine the code runs in the range of Peso's version 4a. It's still not as fast as Matt's CLR though (at least on my machine)...
answered Dec 16, 2009 at 06:04 PM
I opted for no hardwiring at all, because you never know when a CallArea changes and suddenly has 6 digits, or more...
First, the Setup part
And then the Teardown part
And finally the first version of the query
Phil Factor 1b (1a got mangled by the website because it had a < in the code)
This is my first entry, just to show a fairly conventional way of doing it. The call table is recast in a slightly more compute-friendly form, and the same is done for the PhoneTariff table. Then the international call prefixes are identified by updating the table progressively, starting with the longest codes first; at the same time, the PhoneTariff entry and the initial connection charge are identified.
Then, the calls are costed out, a charge-band at a time until the longest calls have been costed out.
Once this has been done, then it is a simple matter of aggregating the reports.
I've left my timing harness in place in case you want to tweak the solution, or if you want to see how I generally do it.
On my server, I get times in the 7.5-second range for the whole operation. Last time I checked, the result was the same as Robert Barr's (bless him).