We have a 1 GB log file that is filling up in about 20 minutes. We don't really have your typical DBAs here. I am a developer, and I am trying to figure out why it is filling up so fast.
The last thing I did to try to see where the problem is was to run a trace. I 'think' the trace was running when the log file filled up again, but I'm not really sure what to look for or what is causing this problem. Can someone please help?
I'm running Microsoft SQL Server 2000 - 8.00.818 (Intel X86) May 31 2003 16:08:15 Copyright (c) 1988-2003 Microsoft Corporation Enterprise Edition on Windows NT 5.0 (Build 2195: Service Pack 4)
We have lots of apps hitting the DB (front ends, batch processes, etc.). However, the acting 'DBA' says we have many other DBs that are much larger, with the same size log file and just as many transactions, and the problem does not happen with them.
I don't know if this helps, but it always seems to be the same person who encounters the 'log file full' error. When I look at the trace file, I can see a 'rollback' next to her ID - I'm assuming that's when she got the error. So I figured I would look at the last insert or update that happened before the error, but I don't know whether that transaction actually caused the error or was just the one that put it over the top. This is a big problem for our users, as it happens about twice a day.
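Two standard SQL Server 2000 diagnostics can help narrow this down before digging through the trace; `YourDatabase` below is a placeholder for the actual database name:

```sql
-- How full is each database's log right now?
DBCC SQLPERF(LOGSPACE)

-- Is there a long-running open transaction keeping the log
-- from being truncated? This shows the oldest open transaction:
DBCC OPENTRAN('YourDatabase')
```

If DBCC OPENTRAN reports an old open transaction, its SPID is usually the place to start looking.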
I think that may be due to a bulk deletion, insertion, or update being run as one transaction with a single commit at the end.
Have the developer break that transaction into smaller transactions, committing each one (or rolling back on error).
That is done using BEGIN TRANSACTION.
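A minimal sketch of that batching idea for a big delete, using SQL 2000 syntax (`BigTable` and `archived` are hypothetical names - adapt to the real statement):

```sql
-- Delete in small committed batches instead of one huge transaction,
-- so the log space used by each batch can be reused after commit.
DECLARE @err int, @rows int
SET ROWCOUNT 5000                 -- SQL 2000: cap rows affected per statement
SET @rows = 1
WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION
    DELETE FROM BigTable WHERE archived = 1
    SELECT @err = @@ERROR, @rows = @@ROWCOUNT   -- capture both at once
    IF @err <> 0
    BEGIN
        ROLLBACK TRANSACTION
        BREAK
    END
    COMMIT TRANSACTION            -- this batch's log space can now be reused
END
SET ROWCOUNT 0                    -- reset the limit
```

The same pattern works for batched updates; the key point is committing between batches rather than holding one transaction open for the whole operation.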
Review the SQL commands and how each transaction is handled.|||
It also makes a difference whether your database is in simple or full recovery model.
With full recovery, log space will not be reused until a log backup is performed.
With simple recovery model, log space is reused when all transactions with records in that area of the log have either committed or rolled back.
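In the full recovery model, a regular log backup is what frees the inactive log space. A sketch, where the database name and backup path are just examples:

```sql
-- Check which recovery model the database is using (SQL 2000):
SELECT DATABASEPROPERTYEX('YourDatabase', 'Recovery')

-- In full recovery, back up the log on a schedule so the
-- inactive portion can be reused:
BACKUP LOG YourDatabase
    TO DISK = 'D:\Backups\YourDatabase_log.bak'
```

If the database is in full recovery and no log backups are scheduled, the log will only ever grow, which matches the symptom described above.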
|||You can switch the recovery model from full to bulk-logged when importing tons of data. Then, of course, put it back and don't forget a full DB backup to protect the data in case you need to recover the database.
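That switch can be sketched like this (the database name and path are examples, not from the original thread):

```sql
-- Before a big bulk load, reduce logging for bulk operations:
ALTER DATABASE YourDatabase SET RECOVERY BULK_LOGGED

-- ... run the bulk insert / BCP load / index rebuild here ...

-- Afterwards, switch back and take a full backup so the
-- database is fully recoverable again:
ALTER DATABASE YourDatabase SET RECOVERY FULL
BACKUP DATABASE YourDatabase
    TO DISK = 'D:\Backups\YourDatabase_full.bak'
```

Note that while in bulk-logged mode, point-in-time recovery is limited for log backups containing bulk operations, which is why the full backup afterwards matters.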