Wednesday, March 28, 2012

Log getting too big

Hello,

I have a local database that is refreshed overnight. A DTS batch grabs several views from another database and makes local copies. This process creates a huge number of unnecessary entries in the log file - sometimes so many that the entire available disk space is used, which crashes the DTS batch. This database is mainly static and non-transactional. I wonder if it's possible to disable the logging process altogether for that particular db, but I know it's part of SQL Server, so it can't be that simple.

If I limit the size of the log file, it fills up and the same problem occurs as when all the disk space was used... what should I do?

|||Not possible to skip the log.
But possible to clean it (via a backup, or via sp_dboption 'YourDB', 'trunc. log on chkpt.', 'true', ...).
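For example (a sketch for SQL Server 2000 - YourDB is a placeholder for your database name):

-- truncate the inactive part of the log without backing it up
backup log YourDB with truncate_only;
-- or let the server truncate the log automatically at every checkpoint
exec sp_dboption 'YourDB', 'trunc. log on chkpt.', 'true';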
|||You can set the recovery model of the database to simple
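For example (SQL Server 2000 syntax - again YourDB is a placeholder):

alter database YourDB set recovery simple;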

|||Originally posted by Enigma:
You can set the recovery model of the database to simple

... and keep DTS transactions as short as possible - every checkpoint will clean up the transaction log.

|||Recovery model already set to simple...

I added a SQL task at the end of my batch that does this:

backup log someDatabase with truncate_only;
dbcc shrinkfile (someDatabase_log);

but still my log is several gigabytes big... maybe I'll add that SQL task after each import task, something like this:
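(a rough sketch - someDatabase_log is the logical name of my log file, and the 100 MB target is arbitrary)

use someDatabase;
checkpoint;  -- flush dirty pages so the inactive part of the log can be truncated
backup log someDatabase with truncate_only;
dbcc shrinkfile (someDatabase_log, 100);  -- shrink the log file toward 100 MB
dbcc sqlperf (logspace);  -- show how full each database's log actually is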

*sigh*

|||The Simple recovery model logs each transaction and clears the log at each CHECKPOINT - that's why it gets so big during your DTS run. Try the Bulk-Logged recovery model: it minimally logs bulk operations such as BCP and SELECT INTO, so during your DTS the log may fill up less than under the Simple recovery model. Bulk-Logged requires transaction-log backups to clear older records.

|||I'll give your clever solution a try... it makes sense to me. I'll post a follow-up Monday on whether everything went OK (or not). For the past 4 weeks my batch has systematically crashed on weekends.
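If I understand correctly, that would be something like this (a sketch - the backup path is made up):

alter database someDatabase set recovery bulk_logged;
-- after the DTS run, back up the log so the older records can be cleared
backup log someDatabase to disk = 'd:\backups\someDatabase_log.trn';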

Thank you all for your time and patience!

Have a good weekend (6pm in Europe already...)

|||Anyone want to scream bcp?
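A round-trip would look something like this (a sketch - server, object, and file names are made up; with the Bulk-Logged model and the TABLOCK hint, the load can be minimally logged):

bcp SourceDB.dbo.SomeView out c:\dump\someview.dat -n -T -S SourceServer
bcp someDatabase.dbo.SomeTable in c:\dump\someview.dat -n -T -S LocalServer -h "TABLOCK"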
