This sounds simple, but I can't seem to get a good steer on it. I have a database, and some of the child tables might have around 1 million records added to them in a week. After about 2 or 3 days this data is redundant and can be removed, but I would like to keep it for archiving and maybe reporting in the future.
The system is inactive 1 day a week and I want to run a backup on that day.
I want to take all the records more than 1 week old, move them to another database with their PKs etc., and then delete them from the current DB.
I was looking to see whether I could use a cumulative backup in SQL using publish/subscribe, or whether I should write a stored proc and call it as a job every week. I was also looking at the MERGE command, which seems handy. It also seems that deleting this amount of data can be slow, so is there a fast delete if I take this approach?
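For reference, the slow delete I'm worried about would be a single statement removing a week's worth of rows in one go. The usual way I've seen it sped up is to delete in batches so each transaction stays small (table and column names here are made up for illustration):

```sql
-- Hypothetical child table dbo.ChildRecords with a CreatedDate column.
-- Deleting in batches keeps each transaction (and the log growth) small.
DECLARE @rows int;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (10000)
    FROM dbo.ChildRecords
    WHERE CreatedDate < DATEADD(day, -7, GETDATE());

    SET @rows = @@ROWCOUNT;
END
```

Is something along these lines the right approach, or is there something faster?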
looking for best practice and guidance... Mick
asked Jun 07, 2010 at 06:15 AM in Default
You ought to get a few answers along the lines of building a data warehouse out of this and using SSIS, neither of which I work with enough to give a good answer on.
The one thing I would say is that to get rid of the data, I would copy the rows you want to keep into a new table, drop the old table, and then rename the new table to the old table's name. If there are lots of constraints that make this messy, then I would change the drop-table step to a TRUNCATE TABLE, re-insert the kept rows from the new table, and drop the new table. TRUNCATE works much more quickly than DELETE because it deallocates whole pages instead of logging every row deletion.
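A rough sketch of what I mean, assuming a table dbo.ChildRecords with a CreatedDate column (all names here are made up):

```sql
-- 1. Copy the rows worth keeping into a new table.
SELECT *
INTO dbo.ChildRecords_New
FROM dbo.ChildRecords
WHERE CreatedDate >= DATEADD(day, -7, GETDATE());

-- 2. Drop the old table and rename the new one into place.
DROP TABLE dbo.ChildRecords;
EXEC sp_rename 'dbo.ChildRecords_New', 'ChildRecords';

-- Variant when constraints make dropping the table messy:
-- truncate the original and put the kept rows back instead.
-- TRUNCATE TABLE dbo.ChildRecords;
-- INSERT INTO dbo.ChildRecords SELECT * FROM dbo.ChildRecords_New;
-- DROP TABLE dbo.ChildRecords_New;
```

Note that if the table has an identity column, the re-insert variant needs SET IDENTITY_INSERT ON and an explicit column list to preserve the original key values.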
answered Jun 07, 2010 at 06:41 AM
I've had similar issues in the past, while working in the conveyancing business. What I did was create an "Archive" database with essentially the same schema, but without the identity constraints and foreign-key relationships on the various table ID fields, and put together a program that worked roughly as follows:
Can you say "Distributed Transaction"?
The "Archivable" criteria were generally along the lines of cases that had been closed for longer than a set time interval. (This interval was, initially, 2 years, then 1 year, then 6 months. I don't think we took it down to three months, but it was an option...)
I then put together an archived-case view in case people wanted / needed to look at the old data, and also rewrote the application that built the reporting database (a nightly task...) to grab data from both "Live" and "Archive", timing it to run at a time when the archive program wasn't running. Getting the data back from Archive to Live is left as an exercise for the reader. ;-)
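Per table, the move itself was essentially an INSERT ... SELECT into the archive database followed by a DELETE, wrapped in a transaction. A rough sketch (database, table, and column names are invented; because the archive tables have no identity constraints, the original ID values insert straight in):

```sql
BEGIN TRANSACTION;

-- Copy archivable rows into the archive database, keeping the original IDs.
INSERT INTO ArchiveDB.dbo.Cases (ID, ClientName, ClosedDate)
SELECT ID, ClientName, ClosedDate
FROM LiveDB.dbo.Cases
WHERE ClosedDate < DATEADD(month, -6, GETDATE());

-- Remove them from the live database.
DELETE FROM LiveDB.dbo.Cases
WHERE ClosedDate < DATEADD(month, -6, GETDATE());

COMMIT TRANSACTION;
```

This is also where the "Distributed Transaction" remark bites: if the archive database lives on a different server, the cross-server transaction gets escalated to MSDTC, with all the configuration fun that entails.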
answered Aug 17, 2010 at 09:28 AM