Last post Dec 11, 2014 07:31 AM by PatriceSc
Dec 09, 2014 10:36 AM|strangermike|LINK
We have a table that contained millions of unneeded records. Those records were deleted, but the database size still shows it is in excess of 10 GB.
I created a new table with the same design as the one the records were deleted from, and copied the old table into the new table.
I then dropped the old table. I thought I would see a huge decrease in database size, but that did not happen.
Is there something else I need to do to release the space held by the phantom records created by the delete?
Or, when I copied old to new, did the deleted phantom records copy along with the actual records?
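To see whether the deleted rows are still counted against the database, you can compare reserved space against actual data with sp_spaceused. A minimal sketch; the table name here is a placeholder for your own:

```sql
-- Overall database size and unallocated space
EXEC sp_spaceused;

-- Reserved vs. used space for one table
EXEC sp_spaceused N'dbo.YourNewTable';
```

If reserved space is far larger than the data size, the pages freed by the delete are still allocated to the file and only a shrink will return them to the OS.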
Dec 09, 2014 10:44 AM|AZMatt|LINK
I recommend you first do a full backup, then right-click the database in SSMS and select Tasks, Shrink, Database. This should pull up a dialog that tells you how much space you can recover.
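The same operation the SSMS dialog runs can also be issued in T-SQL. A sketch, assuming your database is named BaselineData:

```sql
-- Shrink the whole database, leaving 10 percent free space in each file
DBCC SHRINKDATABASE (BaselineData, 10);
```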
Dec 09, 2014 10:45 AM|strangermike|LINK
Currently trying the Shrink database command as suggested in a previous post. Will report findings.
Dec 09, 2014 10:47 AM|strangermike|LINK
Thanks Matt, this is what I am trying. bbcompent1 also suggested the same.
Dec 09, 2014 10:48 AM|PatriceSc|LINK
See http://msdn.microsoft.com/en-us/library/ms189035.aspx for how to shrink a database. Basically, data needs to be moved to the beginning of the file, and only then can the last part of the file
be truncated. So there is a dedicated high-level operation that does this for you...
Dec 09, 2014 11:23 AM|strangermike|LINK
So while running the shrink I got this error:
Shrink Failed for Database...
Lock request time out period exceeded. Cannot Shrink log file 2 because all logical files are in use. Error 1222
Does this mean someone was accessing the database at the time?
Dec 09, 2014 11:43 AM|limno|LINK
Before you shrink your database, you need to know a few things. What is your database recovery model (FULL or SIMPLE)? Is the 10 GB file size the size of the .mdf file or the .ldf file? Do you have a backup plan in place? Answer these questions before you start to use the shrink commands.
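These questions can be answered with a couple of catalog queries; BaselineData is an assumed database name:

```sql
-- 1. Recovery model (SIMPLE, FULL, or BULK_LOGGED)
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'BaselineData';

-- 2. Physical files and their sizes; size is reported in 8 KB pages
SELECT name, physical_name, type_desc,
       size * 8 / 1024 AS size_mb
FROM sys.master_files
WHERE database_id = DB_ID('BaselineData');
```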
Dec 09, 2014 11:56 AM|strangermike|LINK
The recovery model I think would be FULL, as we take a full backup weekly and if we had to restore we would restore from that.
The 10 GB file is the size of the .BAK file.
As you can see, we are kind of feeling our way through this. Coming from SQL on the mainframe (haha), dropping the table released the space back.
Is Shrink the right command for us to release the space?
Dec 09, 2014 12:10 PM|limno|LINK
If your database is in the FULL recovery model, the first thing you should know is how to back up the log file. Without log backups, your log file will keep growing until it runs out of space someday, and every application using this
database will stop. You still need to control your database size by setting up a healthy log backup routine. As a one-time step, after you back up your log file, you can shrink your database files to an appropriate size, maintain them that way, and you will
have backup files for both full and log backups. Your data will be safe to some extent and you will have control over your database size. I hope you get a sense of LOG backups; it is an important concept to grasp.
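A routine log backup looks roughly like this; the path and database name are assumptions for illustration:

```sql
-- Back up the transaction log; committed portions of the log can then be reused
BACKUP LOG BaselineData
TO DISK = N'D:\Backups\BaselineData_log.trn';
```

Scheduled regularly (e.g. via SQL Server Agent), this is what keeps a FULL-recovery log file from growing without bound.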
Dec 09, 2014 02:06 PM|strangermike|LINK
Limno, I believe I misspoke.
Our recovery model would be SIMPLE (we just use the .BAK file to recover from).
Any loss of data between full database backups would be acceptable, as it can be rebuilt if needed.
So with that, how can we shrink the DB?
Dec 09, 2014 02:36 PM|limno|LINK
-- Look up the logical name of the data file (type = 0) for the database
DECLARE @logicalName_Data sysname;

SELECT @logicalName_Data = name
FROM sys.master_files
WHERE database_id = DB_ID('BaselineData') AND type = 0;

DBCC SHRINKFILE (@logicalName_Data, 250);  -- target size: 250 MB
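Since the earlier timeout error mentioned a log file, the same pattern can be applied to the log by selecting type = 1 instead. A sketch; the database name and 100 MB target are assumptions:

```sql
DECLARE @logicalName_Log sysname;

SELECT @logicalName_Log = name
FROM sys.master_files
WHERE database_id = DB_ID('BaselineData') AND type = 1;  -- 1 = log file

DBCC SHRINKFILE (@logicalName_Log, 100);  -- target size: 100 MB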
Dec 09, 2014 02:43 PM|limno|LINK
If you are interested, you can find more information from this link:
You can increase your log file size with the ALTER DATABASE command:
ALTER DATABASE [BaselineData]
MODIFY FILE
(NAME = BaselineData_Log,
SIZE = 600MB);
Dec 09, 2014 02:58 PM|strangermike|LINK
This looks dangerous! Is this the only way to get rid of the shadow records that were deleted?
I will take a look.
Dec 09, 2014 03:50 PM|limno|LINK
You should always have a copy of your most recent backup before you execute stuff like this at the database level. You are on your own now.
Take it easy. You can restore the backup under a different name, on a different test server or on your own machine, and play with the scripts you find online to test them out.
I agree with you that it looks dangerous, the same way driving on the highway is. Take a backup and verify that the copy of the backup is good.
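Restoring the .BAK under a different name for testing can be sketched like this; all names and paths below are placeholders, and the logical file names can be discovered first with RESTORE FILELISTONLY:

```sql
-- Inspect the logical file names inside the backup
RESTORE FILELISTONLY FROM DISK = N'D:\Backups\BaselineData.bak';

-- Restore as a separate test database under new physical file names
RESTORE DATABASE BaselineData_Test
FROM DISK = N'D:\Backups\BaselineData.bak'
WITH MOVE 'BaselineData'     TO N'D:\Test\BaselineData_Test.mdf',
     MOVE 'BaselineData_Log' TO N'D:\Test\BaselineData_Test.ldf',
     RECOVERY;
```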
Dec 11, 2014 07:31 AM|PatriceSc|LINK
A shrink needs to move data to the beginning of the file; only then can the end of the file be released, since it no longer contains valuable data. The point is not that DELETE doesn't free space, but that if the free space is not at the end of the file, you
can't reduce the file size without doing this reorganization.
This is why it is a database-level operation. It's likely worthwhile only if you deleted so much data that you don't expect to fill the same space again any time soon. It is useless if you'll "soon" insert enough data to reuse the space that has been freed.
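To estimate how much a shrink could actually reclaim, you can compare each file's allocated size with its used pages. A sketch, to be run inside the database in question:

```sql
-- size and SpaceUsed are in 8 KB pages; free_mb approximates what a shrink could release
SELECT name,
       size * 8 / 1024 AS size_mb,
       FILEPROPERTY(name, 'SpaceUsed') * 8 / 1024 AS used_mb,
       (size - FILEPROPERTY(name, 'SpaceUsed')) * 8 / 1024 AS free_mb
FROM sys.database_files;
```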