This sounds simple but I can't seem to get a good steer on it. I have a database, and some of the child tables might have around 1 million records added in a week. After about 2 or 3 days this data is redundant and can be removed, but I would like to keep it for archiving and maybe reporting in the future.
The system is inactive one day a week, and I want to run a backup on that day.
I want to take all the records more than one week old, move them to another database with their PKs etc., and then delete them from the current DB.
I was looking to see whether I could use a cumulative backup in SQL using publish/subscribe (replication), or whether I should write a stored proc and call it as a job every week. I was also looking at the MERGE command, which seems handy. It also seems that deleting this amount of data can be slow, so is there a fast way to delete if I take this approach?
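For the stored proc route, I was thinking of something roughly like the sketch below — a batched DELETE with an OUTPUT clause so the deleted rows (PKs and all) land in the archive database in the same statement. The table names (`dbo.ChildTable`, `ArchiveDB.dbo.ChildTable`) and the `CreatedDate` column are just placeholders for my actual schema:

```sql
-- Sketch only: ChildTable / ArchiveDB / CreatedDate are placeholder names.
-- Deleting in small batches keeps the transaction log and lock footprint manageable.
DECLARE @batch INT = 10000;

WHILE 1 = 1
BEGIN
    DELETE TOP (@batch)
    FROM dbo.ChildTable
    OUTPUT deleted.*                      -- copy the full row, PKs included,
        INTO ArchiveDB.dbo.ChildTable     -- into the archive database (same instance)
    WHERE CreatedDate < DATEADD(DAY, -7, GETDATE());

    IF @@ROWCOUNT = 0 BREAK;              -- nothing older than a week left to move
END
```

Would something like this, scheduled as a weekly Agent job on the inactive day, be a reasonable approach, or is replication/MERGE better suited?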
Looking for best practice and guidance... Mick