Friday, March 23, 2012

Log file size does not reduce

The database log of my DB is around 2GB.
The database is using the FULL recovery option.

I want to reduce the file size of the log because it takes up a lot of
space.

I did a full database backup, then backed up the transaction log as well
... both backups performed with the option "clear inactive
entries from transaction log" checked.

But after the backups, the database log file is still 2GB.

What should I do to reduce the database log file size?
Should I use the following?
==============================
Dump Tran databaseName with no_log
DBCC shrinkdatabase(databaseName, 30)
==============================

Is that safe to use on a production server?

Peter CCH

[posted and mailed, please reply in news]

Peter CCH (petercch.wodoy@.gmail.com) writes:
> Database log of my DB is around 2GB.
> The database is using FULL recovery option.
> I want to reduce the file size of the log because it takes up a lot of
> space.
> I did a full database backup, then backed up the transaction log as well
> ... both backups performed with the option "clear inactive
> entries from transaction log" checked.
> But after the backups, the database log is still 2GB.
> What should I do to reduce the database log file size?
> Should I use the following?
>==============================
> Dump Tran databaseName with no_log
> DBCC shrinkdatabase(databaseName, 30)
>==============================
> Is that safe to use on a production server?

First of all, think twice before you start to shrink files at all.
If you know that your database has experienced a heavy once-in-a-blue-
moon update, then it could make sense to shrink the log. But if the
log is 2GB because of daily operations, the log will grow again. And
while the log is growing you lose performance on your server. And the
log file may become fragmented at the file-system level.

If you use NO_LOG, you must take a full backup, or else you will no
longer have any up-to-the-point recovery option from that point on.

DBCC SHRINKFILE is a better choice for shrinking the log file than
DBCC SHRINKDATABASE, which also operates on the data files.
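
For reference, a minimal sketch of that approach. The logical log file
name databaseName_log and the 100 MB target are only placeholders;
check the real logical name with sp_helpfile first.

==============================
USE databaseName
GO
-- Shrink only the log file, not the data files.
-- Target size is given in MB.
DBCC SHRINKFILE (databaseName_log, 100)
GO
==============================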

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

Hi,

You can use DBCC SHRINKFILE for this purpose.

Best regards,
Chandra
http://groups.msn.com/SQLResource/
http://chanduas.blogspot.com/
------------

*** Sent via Developersdex http://www.developersdex.com ***

I use the below statement. The concept of "it is just going to grow
again" is bogus. If you don't shrink it, it will eventually take up the
entire hard drive it is on (at least in our experience) and we have
experienced no performance problems with shrinking it every day to one
megabyte.

use DBTest
go

DBCC SHRINKFILE(DBTest_log, 1)
BACKUP LOG DBTest WITH TRUNCATE_ONLY
DBCC SHRINKFILE(DBTest_log, 1)
go

> I use the below statement. The concept of "it is just going to grow
> again" is bogus. If you don't shrink it, it will eventually take up the
> entire hard drive it is on (at least in our experience) and we have
> experienced no performance problems with shrinking it every day to one
> megabyte.

This is very wrong and dangerous advice. Peter stated he is using FULL
RECOVERY so you just invalidated his log backups. Shrinking and growing
the log in this way can also put your system completely out of action.

If you need to shrink the log every day then someone isn't doing their
job properly.

Read:
http://www.karaszi.com/SQLServer/info_dont_shrink.asp

--
David Portas
SQL Server MVP
--

pb648174 (google@.webpaul.net) writes:
> I use the below statement. The concept of "it is just going to grow
> again" is bogus. If you don't shrink it, it will eventually take up the
> entire hard drive it is on (at least in our experience) and we have
> experienced no performance problems with shrinking it every day to one
> megabyte.

As David said, if you need to shrink it daily, you have a problem that
you should fix. If the log keeps growing, the most likely reason is that
you are running with full or bulk-logged recovery but are not taking
regular log backups. (Backing up the database does not truncate the
log.)
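
A regular log backup could look something like this; the file path is
just an example, and in practice you would schedule it in a job or a
maintenance plan:

==============================
-- Backs up and truncates the inactive part of the log so the space
-- can be reused. It does not shrink the physical file.
BACKUP LOG databaseName
   TO DISK = 'D:\Backup\databaseName_log.trn'
==============================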

> use DBTest
> go
> DBCC SHRINKFILE(DBTest_log, 1)
> BACKUP LOG DBTest WITH TRUNCATE_ONLY
> DBCC SHRINKFILE(DBTest_log, 1)

Well, I know a better way: if you don't care about up-to-the-point recovery,
and are content with restoring the latest backup in case of a crash, set
the recovery mode to simple, and SQL Server will regularly truncate the
transaction log.
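
For what it's worth, that switch is a one-liner (databaseName is a
placeholder):

==============================
-- After this, SQL Server truncates the log automatically at
-- checkpoints, but point-in-time restore is no longer possible.
ALTER DATABASE databaseName SET RECOVERY SIMPLE
==============================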

If your business requirements do call for up-to-the-point recovery,
you should never use WITH TRUNCATE_ONLY unless there is a real emergency.
And in that case, you should take a full backup immediately afterwards.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

I haven't ever manually set any recovery mode setting, so is there a
default setting somewhere? (I'm wondering why this is the default
behavior.) I have seen this same issue on many other databases and this
is the fix I have always used - We just use database backups for
disaster recovery and make them every two hours.

pb648174 (google@.webpaul.net) writes:
> I haven't ever manually set any recovery mode setting, so is there a
> default setting somewhere

The default setting is inherited from the model database. And model is
shipped with the default setting of full recovery. (A vague recollection
tells me that MSDE may have it the other way round.)
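
If you are unsure what a database is currently set to, you can check it
like this (databaseName is a placeholder):

==============================
-- Returns FULL, SIMPLE or BULK_LOGGED
SELECT DATABASEPROPERTYEX('databaseName', 'Recovery')
==============================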

>(I'm wondering why this is the default behavior)?

Because it is assumed that you will take database and log backups to be
able to recover the database up to the last minute in case of a crash
or a severe manual mistake.

> I have seen this same issue on many other databases and this
> is the fix I have always used - We just use database backups for
> disaster recovery and make them every two hours.

If that is OK, set the recovery mode to simple and forget about the
transaction log. Or start to take regular log backups.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

What about a database maintenance plan? Is it helpful?

I mean, if I configure it to run every night and perform a backup,
will the T-log be shrunk?

TNX

--
Message posted via http://www.sqlmonster.com

Just out of curiosity, what point would there be in up to the minute
recovery? You will still need to make a point in time backup of the mdf
and ldf files, no? So why not just make a backup? Under what scenario
would you use the point in time restore?

pb648174 (google@.webpaul.net) writes:
> Just out of curiosity, what point would there be in up to the minute
> recovery? You will still need to make a point in time backup of the mdf
> and ldf files, no? So why not just make a backup? Under what scenario
> would you use the point in time restore?

Say that you run a web shop. If you have a serious error, how much data
do you want to lose?

Say that you take full backups every second hour. Last backup was at 13:00.
At 14:53 you intend to make a manual update from Query Analyzer to an
order which had some minor problem. But when you select the statement in
the query window you slip, and the query is sent to SQL Server without the
WHERE clause. If you have simple recovery, you will have to restore that
backup from 13:00 and lose two hours of business. (And probably more,
because the customers whose orders disappeared this way may never come
back.)

With full recovery, you can in this situation take a backup of the
transaction log, and you can restore the database to the state it was
in at 14:52, and minimize your losses.

Another option for this fatal scenario is to have a log-reader tool like
Lumigent Log Explorer or LOG PI, which permits you to roll back statements
selectively.

But for a scenario where the database becomes corrupt for an unknown
reason, they are less useful, and of course they can't help in case of
a disk crash. In case of a disk crash you may not be able to back up the
last part of the log. However, some shops run transaction-log dumps
as often as every five minutes, or even every minute.

And for many shops a full backup every second hour is not an option.
If your database is 100 GB, the backup takes time to run. And takes up
disk space.

But far from all businesses need all this. I recall a client whose database
proved to be corrupt. And what's worse, the most recent backup was also
corrupt. We had to go back over a week to find a good version. However,
this was not a major issue for them. They had their transaction log on
paper, so to speak, and could re-register all data that was lost.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

Every minute of data lost costs money. That's why we have up to the
minute recovery from log backups.

How big are your databases? For a terabyte-scale database it isn't
practical to take backups every few minutes or even every hour.
Apparently your databases are small enough that it's feasible to take
2-hourly full backups. In that case use simple recovery.

--
David Portas
SQL Server MVP
--

So if you are keeping a running transaction log and you get a command
that wipes out some data at 2:30, you can go back to how it was at
2:29? Can you post a command or a link on how to do that? I would like
to experiment with that.

http://msdn.microsoft.com/library/d...backpc_5a61.asp
http://msdn.microsoft.com/library/d...backpc_6pv6.asp
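
In outline, such a point-in-time restore looks something like this.
The database name, file paths and the STOPAT time are placeholders,
and if there are several log backups you apply each of them in
sequence; see the links above for the full details.

==============================
-- Restore the last full backup, but do not recover yet
RESTORE DATABASE DBTest
   FROM DISK = 'D:\Backup\DBTest_full.bak'
   WITH NORECOVERY

-- Apply the log backup, stopping just before the bad command at 2:30
RESTORE LOG DBTest
   FROM DISK = 'D:\Backup\DBTest_log.trn'
   WITH STOPAT = '2005-06-15 14:29:00', RECOVERY
==============================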

--
David Portas
SQL Server MVP
--

akej (forum@.SQLMonster.com) writes:
> What about a database maintenance plan? Is it helpful?
> I mean, if I configure it to run every night and perform a backup,
> will the T-log be shrunk?

If your maintenance plan includes a backup of the transaction log, it
will truncate the log. There should not really be any shrinking, though.
Shrinking is only something you should do in special cases.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp
