I'm currently implementing a system which logs exceptions, and one of the features I'm putting into it is a daily report of 'these exceptions have occurred since the last report'. For this I will be tracking the last-reported IDENTITY value in the source table, and reporting on the rows added since. But I thought it would be interesting to find out how you would track such activity - would you:
I'm not advocating any of the above, simply putting down the first few tracking methods that popped into my head. But I'd be very interested to hear how you guys would approach this sort of thing.
Given that there can't really be a 'correct' answer here, I'll mark correct the answer with most votes after a couple of weeks. :)
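For reference, the approach described in the question - tracking the last-reported IDENTITY value and reporting on rows added since - might look something like this sketch (table and column names such as `dbo.ExceptionLog` and `dbo.ReportWatermark` are illustrative, not from an actual schema):

```sql
-- Hypothetical schema: dbo.ExceptionLog (ExceptionId int IDENTITY, LoggedAt, Message)
-- plus a single-row watermark table holding the last reported ID.
CREATE TABLE dbo.ReportWatermark
(
    LastReportedId int NOT NULL
);

-- Daily report: everything added since the last run, then advance the watermark.
DECLARE @LastId int, @MaxId int;

SELECT @LastId = LastReportedId FROM dbo.ReportWatermark;
SELECT @MaxId  = MAX(ExceptionId) FROM dbo.ExceptionLog;

SELECT ExceptionId, LoggedAt, Message
FROM   dbo.ExceptionLog
WHERE  ExceptionId >  @LastId
AND    ExceptionId <= @MaxId;

UPDATE dbo.ReportWatermark SET LastReportedId = @MaxId;
```

Capturing `@MaxId` up front means rows logged while the report is running are left for the next run rather than silently skipped.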
asked Dec 19 '09 at 05:23 PM in Default
Matt Whitfield ♦♦
I suggest creating a table which can act as a log of the report. You can store the timestamp of when the report was generated and the ID of the last exception covered by the report.
Using this table, you can at any time extract information about which exceptions were covered by which report.
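A minimal sketch of such a report-log table and a "what did report N cover" query (names like `dbo.ReportLog` are made up for illustration; the previous-report lookup assumes contiguous IDENTITY values, which a gap-tolerant version would replace with a TOP 1 / ORDER BY lookup):

```sql
CREATE TABLE dbo.ReportLog
(
    ReportId        int IDENTITY PRIMARY KEY,
    GeneratedAt     datetime NOT NULL DEFAULT GETDATE(),
    LastExceptionId int NOT NULL
);

-- Which exceptions were covered by report @ReportId: everything between the
-- previous report's watermark and this report's watermark.
SELECT e.ExceptionId, e.LoggedAt, e.Message
FROM   dbo.ExceptionLog AS e
JOIN   dbo.ReportLog    AS r ON r.ReportId = @ReportId
LEFT JOIN dbo.ReportLog AS p ON p.ReportId = r.ReportId - 1
WHERE  e.ExceptionId >  COALESCE(p.LastExceptionId, 0)
AND    e.ExceptionId <= r.LastExceptionId;
```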
answered Dec 19 '09 at 07:12 PM
I think I would go for Pavel's recommendation, for four reasons:
Why? If you look at your options:
You will not be able to trace when the report was generated - and what happens if you scale your application out to a web farm?
You will not have any traceability here either - and what happens if you rebuild your table with a tool like Red Gate SQL Compare and it ignores extended properties?
This is probably the least recommended solution, because of performance. You add a column with only two states (seen/not seen), and you need to update the table when the report is generated. An index on this column will not be selective enough to be considered by SQL Server, and the constant updates will fragment it. Both of these issues will cause the index or table to be scanned.
On the other hand, if you add a tracker table, you will probably store the key of the source table - and if that key is clustered, SQL Server gets the opportunity to use a clustered index seek.
And with a tracker table you will be able to add a login name or something similar to make the report personal: every user reading the report will see events since the last time they looked at it.
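A per-user tracker along those lines might look like this (a sketch; `dbo.UserReportTracker` and the exception-table columns are hypothetical names):

```sql
CREATE TABLE dbo.UserReportTracker
(
    LoginName       sysname NOT NULL PRIMARY KEY,
    LastExceptionId int     NOT NULL
);

-- Each reader sees only what has arrived since they last ran the report.
DECLARE @LastId int;
SET @LastId = 0;

SELECT @LastId = LastExceptionId
FROM   dbo.UserReportTracker
WHERE  LoginName = SUSER_SNAME();

SELECT ExceptionId, LoggedAt, Message
FROM   dbo.ExceptionLog
WHERE  ExceptionId > @LastId;  -- a clustered index seek if ExceptionId is the clustered key
```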
While you can set up a mechanism that tracks the "last time the report was run", what if it gets run twice in one day and then again tomorrow? You'll only see a partial day's worth of data. Generally when doing something like this, I like to work off date & time data and use a published formula, so everyone knows what to expect from the report. So, for instance, by default it shows the last 24 or 48 hours, or by default it shows the previous two days. Both are easy calculations to put into the T-SQL code, and then you always know what is being presented.
Unless the report can only be generated through an automated process, simply recording "since the last time" could cause misleading communication. However, if you can only ever access a canned report, the suggestions already listed are good methods.
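A published formula of that kind could be as simple as this sketch (the "previous two full days" variant; the midnight truncation uses the classic `DATEADD`/`DATEDIFF` trick so it works on older SQL Server versions):

```sql
-- Window end = midnight today; window start = two full days earlier.
DECLARE @WindowEnd   datetime;
DECLARE @WindowStart datetime;

SET @WindowEnd   = DATEADD(day, DATEDIFF(day, 0, GETDATE()), 0);
SET @WindowStart = DATEADD(day, -2, @WindowEnd);

SELECT ExceptionId, LoggedAt, Message
FROM   dbo.ExceptionLog
WHERE  LoggedAt >= @WindowStart
AND    LoggedAt <  @WindowEnd;
```

Because the window is anchored to calendar days rather than "last run", running the report twice in one day produces the same result both times.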
answered Dec 21 '09 at 10:52 AM
Grant Fritchey ♦♦
For the precise question asked, I rather like Pavel's answer.
Not too long ago I was given a similar problem, where they wanted a daily report of every row that changed in a particular way in a table. The approach I used was to add a trigger that logged all changes to the table, including the time of the change. Then I had a procedure that ran once a day to gather those rows, format them, and use xp_sendmail to e-mail the result to the appropriate people.
I considered not using an AFTER trigger and instead forcing all changes to that table to go through a procedure that would log them, but that would have involved changing code that had been in production on that database since before I joined the team, so the trigger made much more sense in the overall situation.
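A sketch of that trigger-based change log (table and trigger names are hypothetical; the xp_sendmail step from the original setup isn't reproduced here):

```sql
CREATE TABLE dbo.ExceptionChangeLog
(
    ChangeId    int IDENTITY PRIMARY KEY,
    ExceptionId int      NOT NULL,
    ChangedAt   datetime NOT NULL DEFAULT GETDATE(),
    Operation   char(1)  NOT NULL  -- 'I' = insert, 'U' = update
);
GO
CREATE TRIGGER trg_ExceptionLog_Audit
ON dbo.ExceptionLog
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- The virtual 'inserted' table holds the new rows; a non-empty 'deleted'
    -- table means this firing was an UPDATE rather than an INSERT.
    INSERT dbo.ExceptionChangeLog (ExceptionId, Operation)
    SELECT i.ExceptionId,
           CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'U' ELSE 'I' END
    FROM   inserted AS i;
END;
```

The once-a-day procedure then just selects from `dbo.ExceptionChangeLog` for the reporting window and clears or watermarks what it has sent.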
answered Dec 21 '09 at 02:06 PM