Even more impressive is the backup compression. The full backup file used to be about 400 GB and took hours to complete. Backup size now: 44 GB. Full backup time: 13 minutes. Unbelievable.
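For context, this is roughly what a compressed full backup looks like; the database name and file path below are placeholders, not our real ones:

-- Take a compressed full backup (database name and path are placeholders)
BACKUP DATABASE [MyDatabase]
TO DISK = N'X:\Backups\MyDatabase_full.bak'
WITH COMPRESSION, STATS = 10

Compression can also be made the server-wide default, so backups that don't specify WITH COMPRESSION pick it up automatically:

EXEC sp_configure 'backup compression default', 1
RECONFIGURE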
Regarding table compression, here is a query I use to show the compression status of each table:
SELECT distinct
OBJECT_SCHEMA_NAME(OBJECT_ID) as SchemaName,
OBJECT_NAME(OBJECT_ID) AS TableName,
data_compression_desc,
rows
FROM sys.partitions
WHERE OBJECT_SCHEMA_NAME(OBJECT_ID) not in ('sys')
ORDER BY SchemaName, TableName
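If you want to preview how much a given table would shrink before rebuilding it, SQL Server ships sp_estimate_data_compression_savings. This wasn't in my original workflow and dbo.SomeBigTable is a placeholder name; passing NULL for the index and partition covers all of them:

-- Estimate PAGE compression savings for one table (table name is a placeholder)
EXEC sp_estimate_data_compression_savings
    @schema_name = 'dbo',
    @object_name = 'SomeBigTable',
    @index_id = NULL,
    @partition_number = NULL,
    @data_compression = 'PAGE'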
For our sandbox database, where business analysts do extensive "data smashing", I created this stored procedure to compress uncompressed tables:

Warning: I did some retyping here to make the code look nicer on the web page, so I may have introduced a syntax error. If you use it, give it a critical look-over in Management Studio first.

-- This procedure compresses uncompressed tables
-- that have more than 10,000 rows
create proc [dbo].[usp_compress_tables] as
-- get uncompressed tables
declare CUR cursor READ_ONLY FAST_FORWARD for
SELECT distinct
OBJECT_SCHEMA_NAME(OBJECT_ID) as SchemaName,
OBJECT_NAME(OBJECT_ID) AS TableName
FROM sys.partitions
WHERE
data_compression = 0 and rows > 10000
and OBJECT_SCHEMA_NAME(OBJECT_ID) not in ('sys')
declare @tbl sysname, @sch sysname, @cmd varchar(max)
-- loop through tables
open CUR
fetch next from CUR into @sch, @tbl
while @@FETCH_STATUS = 0 begin
-- compose ALTER TABLE ... REBUILD WITH (DATA_COMPRESSION = PAGE)
select @cmd = 'ALTER TABLE [' + @sch + '].[' + @tbl + '] ' +
'REBUILD WITH (DATA_COMPRESSION=PAGE)'
print @cmd
exec ( @cmd )
print 'Done.'
fetch next from CUR into @sch, @tbl
end
-- clean up
close CUR
deallocate CUR
GO
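Once created, the procedure can be run ad hoc or from a nightly SQL Agent job:

exec [dbo].[usp_compress_tables]

Since it prints each ALTER TABLE statement before executing it, the output doubles as a log of what was compressed.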