SQL Server Compression 101

The data compression feature in SQL Server can help reduce the size of a database as well as improve the performance of I/O-intensive workloads, especially on data warehouses. The I/O gains are offset against the extra CPU required to compress and decompress the data as it is exchanged with the application. For data warehouses there is typically spare CPU capacity while storage and memory are at a premium, so Microsoft’s guidance is to compress all objects, and to the highest level. However, if nothing is compressed yet, it is better to take the cautious approach and evaluate each database object individually, along with the effect on the workload, particularly if CPU headroom is limited. Continue reading “SQL Server Compression 101”
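
If you do want to evaluate objects one at a time, something like the following gives a rough idea of the approach (the table and index names here are made up for illustration): estimate the savings first, then rebuild with the chosen compression level.

-- Estimate how much space PAGE compression would save on a hypothetical table.
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'FactSales',
    @index_id         = NULL,      -- NULL = all indexes on the table
    @partition_number = NULL,      -- NULL = all partitions
    @data_compression = 'PAGE';

-- If the savings look worthwhile, apply compression with a rebuild.
ALTER TABLE dbo.FactSales REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Or compress a single index rather than the whole table.
ALTER INDEX IX_FactSales_OrderDate ON dbo.FactSales
    REBUILD WITH (DATA_COMPRESSION = PAGE);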

PowerShell Snippet #7: Retrieve SQL Server Licence Key

Copied from somewhere else on the internet, this PowerShell script will return the product key used for a SQL Server instance install. It’s super useful when switching the temporary VMs I spin up and play around with over to SQL Developer once their instances have passed the Enterprise evaluation use-by date. Putting this here for my own benefit. I claim no kudos!

function GetSqlServerProductKey {
    ## Retrieve the licence key of a SQL Server 2014 instance (registry path "120").
    param ($targets = ".")

    $hklm = 2147483650   # HKEY_LOCAL_MACHINE
    $regPath = "SOFTWARE\Microsoft\Microsoft SQL Server\120\Tools\Setup"
    $regValue1 = "DigitalProductId"
    $regValue2 = "PatchLevel"
    $regValue3 = "Edition"

    foreach ($target in $targets) {
        $productKey = $null
        $win32os = $null

        # Read the encoded product ID, patch level and edition from the target's registry.
        $wmi = [WMIClass]"\\$target\root\default:stdRegProv"
        $data = $wmi.GetBinaryValue($hklm, $regPath, $regValue1)
        [string]$SQLver = $wmi.GetStringValue($hklm, $regPath, $regValue2).sValue
        [string]$SQLedition = $wmi.GetStringValue($hklm, $regPath, $regValue3).sValue

        $binArray = ($data.uValue)[0..16]
        $charsArray = "B","C","D","F","G","H","J","K","M","P","Q","R","T","V","W","X","Y","2","3","4","6","7","8","9"

        ## Decode the base-24 encoded binary data into the 25-character product key.
        for ($i = 24; $i -ge 0; $i--) {
            $k = 0
            for ($j = 14; $j -ge 0; $j--) {
                $k = $k * 256 -bxor $binArray[$j]
                $binArray[$j] = [math]::truncate($k / 24)
                $k = $k % 24
            }
            $productKey = $charsArray[$k] + $productKey
            if (($i % 5 -eq 0) -and ($i -ne 0)) {
                $productKey = "-" + $productKey
            }
        }

        # Return one object per target with OS, SQL version, edition and key details.
        $win32os = Get-WmiObject Win32_OperatingSystem -ComputerName $target
        $obj = New-Object Object
        $obj | Add-Member NoteProperty Computer -Value $target
        $obj | Add-Member NoteProperty OSCaption -Value $win32os.Caption
        $obj | Add-Member NoteProperty OSArch -Value $win32os.OSArchitecture
        $obj | Add-Member NoteProperty SQLver -Value $SQLver
        $obj | Add-Member NoteProperty SQLedition -Value $SQLedition
        $obj | Add-Member NoteProperty ProductKey -Value $productKey
        $obj
    }
}


GetSqlServerProductKey

Turbo LogShip 1.0 Released

Earlier this year, I wrote a long post about log shipping. One of the key points I mentioned was how slow restores can be for a read-only log shipped database. If going and reading the whole thing is too much effort for you right now, I’ll save you the trouble (you child of the internet, you) and tell you it’s because the database needs to be kept transactionally consistent between restores while it is made available for read-only access. SQL Server creates a .tuf file (transaction undo file) to track the progress of all the pages holding uncommitted transactions. The checking, creation, applying and re-applying of this file can take some time when there is a significant number of uncommitted transactions within a log backup. Continue reading “Turbo LogShip 1.0 Released”
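
For a bit of context, this is roughly what a standby restore looks like, the step that creates and re-applies that undo file (the database name and paths below are purely illustrative):

-- WITH STANDBY rolls uncommitted transactions back into an undo (.tuf) file
-- so the database stays readable between restores; that undo work has to be
-- reversed again before the next log backup can be applied, which is where
-- the time goes when a backup contains a lot of uncommitted transactions.
RESTORE LOG [ReportingDB]
FROM DISK = N'\\backupserver\logs\ReportingDB_0100.trn'
WITH STANDBY = N'D:\LogShipping\ReportingDB_undo.tuf';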

SQL Server Database File Size and Free Space Revisited

I recently had a comment posted on one of the first articles I wrote, way back in 2012, asking about adding a column that shows the used space as a percentage. So without any further ado, here is an updated version of the script originally blogged about here.

select
    [File ID] = sf.fileid,
    [File Size in MB] = convert(decimal(12,2), round(sf.size / 128.000, 2)),
    [Max Size in MB] = convert(decimal(12,2), round(sf.maxsize / 128.000, 2)),
    [Space Used in MB] = convert(decimal(12,2), round(fileproperty(sf.name, 'SpaceUsed') / 128.000, 2)),
    [Free Space in MB] = convert(decimal(12,2), round((sf.size - fileproperty(sf.name, 'SpaceUsed')) / 128.000, 2)),
    [Free Space in %] = convert(decimal(12,2), round(100 * (sf.size - fileproperty(sf.name, 'SpaceUsed')) / 128.000, 2) / (sf.size / 128.000)),
    [Used Space in %] = convert(decimal(12,2), round(100 * fileproperty(sf.name, 'SpaceUsed') / 128.000, 2) / (sf.size / 128.000)),
    [File Name] = left(sf.name, 30),
    [File Location] = left(sf.filename, 100)
from dbo.sysfiles sf
order by fileid asc

Log Shipping: It’s Better Than Bad It’s Good!

This is something of an epic post, so grab a tea and get ready… I’m probably showing my age by quoting an old Ren and Stimpy cartoon here, but to be fair it probably sums up log shipping pretty well. This post focuses on using a read-only log shipped database for reporting purposes and the limitations that come with it. I also share some monitoring scripts and a few ideas on how to improve restore performance without having to upgrade the hardware or software.

Despite the development of AlwaysOn in recent releases of SQL Server, log shipping is still a great way to set up a copy of a database to be used for reporting. One of the main reasons it is great is that, unlike AlwaysOn, it is available in the less expensive editions like Standard and Web from SQL Server 2008 onwards. Sure, in SQL Server 2016 AlwaysOn will be available in Standard, but in a greatly cut-down form, and you cannot read from the secondary. So it will be good for DR, but not for reporting (as an aside, it still might be easier to set up log shipping for DR than an AlwaysOn Basic Availability Group, because the latter requires you to set up a failover cluster. Read through the “how to set up Standard Edition Availability Groups” here.) You do, however, need to be careful when setting up log shipping across different editions of SQL Server: whilst you can log ship from Enterprise to Standard/Web, if the database uses any Enterprise-only features then you’ll need to log ship to an Enterprise edition of SQL Server. And because you’ll be using the database for reporting, you’ll need to get it licensed. Continue reading “Log Shipping: It’s Better Than Bad It’s Good!”
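
One quick sanity check before log shipping down an edition: the DMV below lists any Enterprise-only features persisted in a database (run it in the context of the database you plan to ship).

-- Any rows returned (e.g. Compression, Partitioning, TransparentDataEncryption)
-- mean the database cannot be restored onto a Standard/Web edition secondary.
SELECT feature_name, feature_id
FROM sys.dm_db_persisted_sku_features;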

Useful Log Shipping Script

Like all DBAs, I have a collection of random scripts put together over time that live in a folder and are then frequently forgotten about. Today I found a file called “useful log shipping script”, and being the generous guy I am, I’m putting it up here.

SELECT ls.primary_server AS [primaryServer]
,ls.primary_database AS [primaryDB]
,lsd.secondary_database AS [secondaryDB]
,lsd.restore_delay AS [restoreDelay]
,CONVERT(VARCHAR, DATEDIFF(SECOND, lms.last_restored_date, GETDATE()) / (60 * 60 * 24)) + '_' + CONVERT(VARCHAR, DATEADD(s, DATEDIFF(SECOND, lms.last_restored_date, GETDATE()), CONVERT(DATETIME2, '0001-01-01')), 108) AS [timeSinceLastRestore dd_hh:mm:ss]
,CONVERT (VARCHAR (20), lms.last_copied_date, 120) AS [lastCopiedDate]
,CONVERT (VARCHAR (20), lms.last_restored_date, 120) AS [lastRestoredDate]
,lms.last_copied_file AS [lastCopiedFile]
,lms.last_restored_file AS [lastRestoredFile]
,lsd.disconnect_users AS [disconnectUsers]
,ls.backup_source_directory AS [backupSourceDirectory]
,ls.backup_destination_directory AS [backupDestinationDirectory]
,ls.monitor_server AS [monitorServer]
FROM msdb.dbo.log_shipping_secondary ls
INNER JOIN msdb.dbo.log_shipping_secondary_databases lsd ON lsd.secondary_id = ls.secondary_id
INNER JOIN msdb.dbo.log_shipping_monitor_secondary lms ON lms.secondary_id = lsd.secondary_id;

Happy Scripting!

Finding Locks on Tables

Recently I needed to find a stored procedure that was taking a lock on a table and updating it regularly. My starting point was the table name, but I needed to find where the command locking the table was coming from, and which stored procedure was doing the update, if any. As I could not be sure it was a stored procedure at all, I also wanted to capture the program name. Continue reading “Finding Locks on Tables”
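
As a rough sketch of the sort of query involved (not necessarily the exact one from the full post, and dbo.MyTable is a stand-in for the real table name), the lock DMVs can be joined back to the session and its current statement like this:

-- Sessions holding or waiting on object-level locks against a given table,
-- along with the program name and the statement currently executing.
SELECT
    tl.request_session_id,
    tl.request_mode,
    tl.request_status,
    es.program_name,
    es.login_name,
    st.text AS current_sql
FROM sys.dm_tran_locks AS tl
JOIN sys.dm_exec_sessions AS es
    ON es.session_id = tl.request_session_id
LEFT JOIN sys.dm_exec_requests AS er
    ON er.session_id = tl.request_session_id
OUTER APPLY sys.dm_exec_sql_text(er.sql_handle) AS st
WHERE tl.resource_type = 'OBJECT'
  AND tl.resource_database_id = DB_ID()
  AND tl.resource_associated_entity_id = OBJECT_ID('dbo.MyTable');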

EXEC sp_executeSQL

Today I am going to talk about the stored procedure sp_executesql and the benefits it has over EXEC @SQL. Microsoft actually recommend using sp_executesql instead of the EXECUTE statement. This is because of parameter substitution, which makes sp_executesql a far more efficient way of executing dynamic SQL. Continue reading “EXEC sp_executeSQL”
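
A minimal illustration of the difference (dbo.Orders and its columns are invented for the example):

DECLARE @CustomerId INT = 42;

-- EXEC: the value is concatenated into the statement text, so every distinct
-- value produces a different string and plan reuse is unlikely.
DECLARE @sql NVARCHAR(MAX) =
    N'SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = '
    + CAST(@CustomerId AS NVARCHAR(10));
EXEC (@sql);

-- sp_executesql: the parameter stays out of the statement text, so the same
-- parameterised plan can be reused for any @CustomerId value.
EXEC sp_executesql
    N'SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = @CustomerId',
    N'@CustomerId INT',
    @CustomerId = @CustomerId;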

Service Pack – SQL Server 2008 SP4

Despite mainstream support for SQL Server 2008 having ended earlier this year, Microsoft have released SQL Server 2008 Service Pack 4. It arrives a few days after Service Pack 3 for SQL Server 2008 R2, but three years since the last service pack for this version of SQL Server. That’s a lot of updates!

Continue reading “Service Pack – SQL Server 2008 SP4”