Part one of this post concerned the notion of SSDT and Pre-Model Scripts. This post relates to an actual implementation: executing a SQL script using Octopus Deploy. My pull request on GitHub was approved, so my step template can be exported from the Octopus Library.
The only part that I feel requires some explanation is the SQL scripts and the option to select a previous step. If you’ve read part one of this post you’ll know that you keep the pre-model scripts in a folder in the database solution. These are not compiled (i.e. not included in the build) but are included as part of the output and packaged up in the NuGet package. When the NuGet package is unpacked, it’s then a case of locating the scripts and executing them.
First, the order of the steps: extract the NuGet package that includes the dacpac and the pre-model scripts, and then add the Execute SQL step:
In the step template for Execute SQL there are Octopus parameters. These define the step in which the scripts were unpacked, and the final path to the scripts themselves:
And these are the values of the parameters:
So now when we execute the step, we can see which scripts were executed:
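To make the mechanics concrete, here is a rough sketch of what the Execute SQL step does once the package is unpacked: locate the scripts in the configured folder and run them in name order, logging each one. The folder layout and file names here are made up for illustration, and the sqlcmd invocation is shown commented out because the server and database come from Octopus parameters in the real template.

```shell
# Simulate an unpacked NuGet package containing pre-model scripts.
# File names are hypothetical; numbering them enforces execution order.
SCRIPT_DIR=$(mktemp -d)
touch "$SCRIPT_DIR/001_create_index_online.sql" "$SCRIPT_DIR/002_backfill_column.sql"

# The shell glob expands in name order, so 001_ runs before 002_.
for script in "$SCRIPT_DIR"/*.sql; do
    echo "Executing $(basename "$script")"
    # In the real step this would be something like:
    # sqlcmd -S "$SqlServerInstance" -d "$DatabaseName" -E -i "$script"
done
```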
Two of my most recent posts have concerned pre-model scripts: those scripts that need to be run before the dacpac is compared against the production database. These are sometimes necessary, usually because SSDT may not produce an optimal script. One such example is index creation: no index is created ONLINE, and this can be a problem if the database being deployed to is in use during the operation, and can be even worse if the table is particularly large.
In fairness to the SSDT team, I can see why this is the case: some editions of SQL Server have the online index feature, some don’t. So one solution may be to write a pre-model script that creates the index with the ONLINE option included. And while there’s nothing wrong with this, there is an alternative: deployment contributors. Continue reading “The Alternative to Pre-Model Scripts”
I recently had a comment posted on one of the first articles I wrote, way back in 2012, concerning adding a column that details the used space as a percentage. So without any further ado, here is an updated version of the script originally blogged about here.
select
sf.FILEID AS [File ID],
[File Size in MB] = convert(decimal(12,2),round(sf.size/128.000,2)),
[Max Size in MB] = convert(decimal(12,2),round(sf.maxsize/128.000,2)),
[Space Used in MB] = convert(decimal(12,2),round(fileproperty(sf.name,'SpaceUsed')/128.000,2)),
[Free Space in MB] = convert(decimal(12,2),round((sf.size-fileproperty(sf.name,'SpaceUsed'))/128.000,2)) ,
[Free Space in %] = convert(decimal(12,2),round(100*(sf.size-fileproperty(sf.name,'SpaceUsed'))/128.000,2)/(sf.size/128.000)) ,
[Used Space in %] = convert(decimal(12,2),round(100*(fileproperty(sf.name,'SpaceUsed'))/128.000,2)/(sf.size/128.000)) ,
[File Name] = left(sf.NAME,30),
[File Location] = left(sf.FILENAME,100)
from dbo.sysfiles sf
order by fileid asc
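As an aside, the /128.000 that appears throughout the script converts 8 KB pages into megabytes (128 pages of 8 KB each make up 1 MB). A quick sketch of the arithmetic, with an example page count:

```shell
# sysfiles reports sizes in 8 KB pages; dividing by 128 converts to MB,
# which is what the sf.size/128.000 expressions above are doing.
pages=12800
mb=$(awk "BEGIN {printf \"%.2f\", $pages / 128}")
echo "$pages pages = $mb MB"
```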
A real quickie this evening, and it’s something I discovered yesterday the hard way: using NOLOCK table hints can cause blocking. Yep, you read that right. Given that the hint is called NOLOCK, it’s counter-intuitive to what you’d instinctively think, but it’s a fact, it’s by design, and there’s nothing wrong with it. Continue reading “When NOLOCK Locks”
When extracting dacpacs it’s quite straightforward to go through the UI, but what if you want to do this for more than one database, or automate the process for a build? sqlpackage.exe enables us to extract a dacpac in silent mode. Below is the command, and I will go through each parameter:
"C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe" /action:Extract /OverwriteFiles:False /tf:"C:\Users\richard.lee\Documents\dacpacs\dbname_Baseline.dacpac" /SourceConnectionString:"Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=AdventureWorks;Data Source=SQLA"
- /action: we want to extract the dacpac to a location. The other options available for the action parameter are: Extract | DeployReport | DriftReport | Publish | Script | Export | Import
- /tf: short form of the parameter /targetfile. This is the location where the dacpac will be created. If a dacpac of the same name already exists it will be overwritten, unless /OverwriteFiles is set to False as above.
- /sourceconnectionstring: the short form for this parameter is /scs. This connects to the database you want to extract the dacpac from.
- /OverwriteFiles: specifies whether sqlpackage.exe should overwrite existing files. Specifying False causes sqlpackage.exe to abort the action if an existing file is encountered. The default value is True.
Looking at the MSDN site for sqlpackage.exe it is clear that there are many more parameters available, but for my database, these are the only ones I need.
The /tf parameter is also the only /target parameter that is accepted when the action is set to extract, and all /source parameters are valid except for /sourcefile.
Some properties you need to be wary of:
- /p:IgnorePermissions: if you have the permissions scripted out for all the service accounts and users, or they exist in the database solution, then it would make sense to set this to false. By default, users with sysadmin on the instance will not lose their access. Ultimately what you choose here will depend on whether you also have the IgnorePermissions property set in the publish.xml file for when you script out the deployment file.
- /p:Storage: I’ve blogged in depth about this one before here but to sum up; if you have lots of memory on the servers that you are extracting from, or they’re not production servers, or the databases are small, then keep this at memory. However if you don’t want to impede performance on a data warehouse server, or there is no spare memory, then set this option to file. It may take longer but it is worth it in the long term.
- /p:ExtractAllTableData: it’s important to note that when you are extracting a dacpac, if you set /p:ExtractAllTableData=true then you will not be able to specify individual tables using the /p:TableData property. In order to specify one or more user tables from which to extract data, set /p:ExtractAllTableData=false.
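Putting the above together, a full extract command with these properties set might look like the following. This sketch just assembles and prints the command line; the instance, database, and output path are placeholders rather than values from a real environment.

```shell
# Build a sqlpackage.exe extract command with the /p: properties discussed
# above. Paths, server, and database names are examples only.
SQLPACKAGE='C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe'
ARGS='/action:Extract /OverwriteFiles:False'
ARGS="$ARGS /tf:C:\dacpacs\AdventureWorks.dacpac"
ARGS="$ARGS /SourceConnectionString:\"Integrated Security=SSPI;Initial Catalog=AdventureWorks;Data Source=SQLA\""
ARGS="$ARGS /p:IgnorePermissions=False /p:Storage=File /p:ExtractAllTableData=False"

# Print the assembled command rather than running it.
echo "\"$SQLPACKAGE\" $ARGS"
```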
Here is the MSDN site for all the parameters that are available for SqlPackage.exe.
I feel like I haven’t posted in a while despite posting only yesterday. This is because I wrote all the WiX posts in a batch and have them on auto release (definitely a good idea to write all the posts for a series and have them done before anything else). Continue reading “View Partitions in Database Partition Schema”
Last year saw the release of both SQL Server 2012 and Windows Server 2012. Both brought significant updates, including changes to High Availability and Disaster Recovery in SQL Server 2012, and a focus on PowerShell and Server Core features in Windows Server 2012, not to mention improvements to Hyper-V, which could lead my company to virtualize the only boxes that are still physical, namely the database and SSAS boxes. So before we make this leap it’s important to hear from those who have experience and can pass it on. Apart from reading MSDN and trying out a few labs, there are a couple of user groups that I will be attending:
Continue reading “SQL Server 2012 Unknown Unknowns”
Registration to attend SQLBits is now open.
There’s a 600-person capacity limit and they anticipate it will fill rather quickly. More details can be found here:
Hope to see you there!
SQLBits XI has been announced for the 2nd-4th of May 2013. Having taken place in London this year, it is moving back north to the Midlands, at the East Midlands Conference Centre. Apparently there are great transport links, and being smack in the centre of England, I guess that makes it convenient for everyone to get to.
If you work with SQL Server in any way and you’ve never been to SQLBits, I’d highly recommend it, because:
- it takes place over a weekend, so only requires one day away from the office
- it’s free! So aside from room and travelling expenses, it’s a very low-cost conference.
- the quality of past speakers has been very high. SQL Experts such as Brent Ozar, Buck Woody, Ola Hallengren, Kevin Kline… the list goes on. I came away with a lot of ideas to try out at work, and certainly learnt of a few valuable resources out on the web.
Try and convince your boss that it is worth your while. Research the sessions that are taking place and outline the ones that are pertinent to your professional development; does attending help you achieve any of the objectives covered in your appraisal? If there are other people in your team who might also benefit from the knowledge, have a plan for how you intend to communicate what you have learnt to the team. At the moment registration is not open; they are just taking session submissions, but keep checking back as registration fills up quickly!
(There is a more up to date version of this script here)
Hey folks, and welcome to my first proper blog post. One of the things that I like to monitor through my daily checks is the file size and the free space of each file within the databases. During a busy loading period some databases can grow massively, so it’s important to keep an eye on the growth. Through the UI, SSMS only gives us the size of the files and the max size that they can grow to, which is clearly not very useful. Fortunately, a quick query on the dbo.sysfiles view of each database that we want to monitor gives us some info:
but this isn’t entirely useful. For starters, the size is reported in 8 KB pages, which makes sense given that databases store data in 8 KB pages. Whilst that may be OK for a small database like this one, our data warehouses are far too big for sizes in pages to be useful. Also, we can infer the remaining space, but again it’s not too helpful to have to figure this out for ourselves.
Continue reading “SQL Server Database File Size and Free Space”