SSDT And Pre-Model Scripts Part 2: Execute Pre-Model Scripts Using Octopus Deploy

Hello!

Part one of this post covered the notion of SSDT and pre-model scripts. This post deals with an actual implementation: executing SQL scripts using Octopus Deploy. My pull request on GitHub was approved, so my step template can now be exported from the Octopus Library.

The only part that I feel requires some explanation is the SQL scripts themselves and the option to select a previous step. If you’ve read part one of this post you’ll know that the pre-model scripts are kept in a folder in the database solution. They are not compiled (i.e. they are not part of the build) but are included in the output and packaged up in the NuGet package. When the NuGet package is unpacked, it’s then a case of locating the scripts and executing them.

First, the order. Extract the NuGet package that includes the dacpac and the pre-model scripts, and then add the execute SQL step:

(screenshot: step order)

In the step template for executing the SQL there are a couple of Octopus parameters. These define the step in which the scripts were unpacked, and the final path to the scripts themselves:

(screenshot: step options)

And these are the values of the parameters:

(screenshot: parameter values)

So now when we execute the step, we can see which scripts were executed:

(screenshot: the scripts executed by the step)
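
For the curious, the logic inside the step boils down to a handful of lines of PowerShell. The snippet below is not the step template itself, just a sketch of the idea: it resolves the folder that the previous step extracted the package to via the Octopus output variable, and then runs each sql file it finds. The step name, folder name and connection variables are all made up for the example:

#assumes the SQL Server PowerShell module (SQLPS/SqlServer) is available on the tentacle
#these values are made up; in the real step template they come from the step parameters
$unpackStep = "Extract Database Package"   #the step that extracted the NuGet package
$scriptFolder = "PreModelScripts"          #the folder inside the package that holds the scripts

#Octopus exposes the extraction path of a previous package step as an output variable
$installPath = $OctopusParameters["Octopus.Action[$unpackStep].Output.Package.InstallationDirectoryPath"]
$scriptPath = Join-Path $installPath $scriptFolder

#run every script in the folder, logging each one so the deployment output shows what was executed
#DatabaseServer and DatabaseName would be project variables defined in Octopus
Get-ChildItem -Path $scriptPath -Filter *.sql | Sort-Object Name | ForEach-Object {
    Write-Host "Executing $($_.Name)"
    Invoke-Sqlcmd -ServerInstance $OctopusParameters["DatabaseServer"] -Database $OctopusParameters["DatabaseName"] -InputFile $_.FullName
}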

When To Use Octopus Deploy Script Modules

Hello!

Lately I’ve been thinking a lot about script modules in Octopus Deploy. Script modules, for those of you not in the know, are collections of PowerShell cmdlets or functions that exist outside of a script step and can be used across multiple steps in different projects/environments. They don’t have to contain functions exclusively; they can also contain code that will get executed upon the initial import of the module. The issue here is that these script modules are imported for every step in a process, so unless the code you write is idempotent, you’ll probably cause an error somewhere in the process. This is by design.
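
To illustrate the point about code that sits outside of functions, here is a hypothetical script module (the module and function names are invented). The function only runs when a step calls it; the last line runs on every import, which means once per step in the process:

#hypothetical script module: DatabaseHelpers

#this function only runs when a step explicitly calls it, so it is safe to share across steps
function Test-DatabaseParameters
{
    param(
        [parameter(Mandatory=$true)] [string]$ServerInstance,
        [parameter(Mandatory=$true)] [string]$DatabaseName
    )
    if ([string]::IsNullOrWhiteSpace($DatabaseName)) { throw "DatabaseName cannot be empty." }
    write-host "Parameters validated for $ServerInstance/$DatabaseName"
}

#this line sits outside any function, so it runs every time the module is imported,
#which is once for every step in the process, so it had better be idempotent
write-host "DatabaseHelpers module imported"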

So at first I thought “put nothing in script modules”, because they’re pretty pointless, right? Anything you write will pertain to a step exclusively, so why would you need to share code across steps? It turns out quite a bit of code is shared across steps, particularly code for checking and validating parameters.

Then, as the step templates got more sophisticated they got harder to read, so I thought “put everything in script modules!” That aided the legibility of the step templates and allowed them to focus mainly on setting up a process, with the script modules doing the actual work.

There is, however, an issue with this: script modules become the “magic” of a step template. If you’re not aware that a step template requires a script module, how is that implicit dependency communicated? You could mention in the step template that a script module is required, but that is still a manual process that relies on everyone remembering it, or even buying into it.

There is also another issue: I had to make a change to a step template that required a change to the script module. The step template in question was in heavy use, and so careful planning was required to make sure that the change to the step template and the script module was not a breaking change. What I mean is that whereas a step template has to be explicitly updated for a given step in a deployment pipeline, the same cannot be said for a script module: whatever the latest version of the script module is, that is what a user gets. And if you make a few changes to the step template in an iterative manner, you have to be sure that if anyone updates the step template then all the versions still match one another.

SSDT and Pre-Model Scripts

Hello!

Earlier this week I posted about the need to create the database as a pre-model task: that is, creating the database before we run sqlpackage/DacFx against a database/server, so that the deployment script itself does not create the database.

Going a step further than this: pre-deploy scripts in SSDT are executed before the main diff script is executed, but not before the diff script is generated. This is an important fact to understand. If you want scripts to be executed before the diff is generated, then you need pre-model scripts.

How you go about this is up to you, but there are a few rules that must be followed:

Idempotent: a big word, and one used in maths, but don’t let that put you off. What it means in this context is that a script can be re-run and produce the same result each time. So in the context of altering a primary key, the pseudo-code would look something like this:

if database exists then
    if table exists then
        if primary key name eq “oldname” then
            drop primary key then
            add primary key with “newname”

and then you add primary key “newname” to the solution. That way the drop-and-add part of the script will only ever be run once, or again against an older version of the database that still has the old primary key. The “newname” of the primary key guarantees that this will be the case.
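
To make that a little more concrete, here is a sketch of what such a script might look like when run with Invoke-SqlCmd; every server, database and object name below is invented:

#hypothetical idempotent pre-model script: swap a primary key only if the old one still exists
$renamePrimaryKey = @"
IF EXISTS (SELECT 1 FROM sys.key_constraints kc
           JOIN sys.tables t ON kc.parent_object_id = t.object_id
           WHERE kc.name = 'PK_Orders_Old' AND t.name = 'Orders')
BEGIN
    ALTER TABLE dbo.Orders DROP CONSTRAINT PK_Orders_Old;
    ALTER TABLE dbo.Orders ADD CONSTRAINT PK_Orders_New PRIMARY KEY (OrderId);
END
"@

#re-running this is harmless: once PK_Orders_Old is gone the IF EXISTS check finds nothing and the batch does nothing
Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDatabase" -Query $renamePrimaryKey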

Stored in the Solution: within database solutions you can add scripts that are part of the solution but are not part of the build:

(screenshot: script file properties in the solution)

If you copy the file to the output, you’ll be able to add the files required to your nuspec file and get them deployed as part of the NuGet package. Neat!

(screenshot: the nuspec file referencing the pre-model scripts)

So if you have the scripts in the solution and they are re-runnable, then you can be confident that they will run correctly against older versions of the database. It also keeps a level of audit and history of the database.

Now, how to run the scripts. As with the create database script, SMO is probably the easiest way to do this. You could use SQLCMD, or Invoke-SqlCmd, but it’s entirely up to you.
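
If you do go down the SMO route, a minimal sketch of executing everything in a folder might look like this (and it is only a sketch, not the library step template; the server name, database name and folder path are placeholders):

#load SMO and run every sql file in a folder against the target database
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null

$server = New-Object Microsoft.SqlServer.Management.Smo.Server -ArgumentList "MyServer"
$database = $server.Databases["MyDatabase"]

Get-ChildItem -Path "C:\Deploy\PreModelScripts" -Filter *.sql | Sort-Object Name | ForEach-Object {
    write-host "Executing $($_.Name)"
    $sql = Get-Content -Path $_.FullName -Raw
    $database.ExecuteNonQuery($sql)   #runs the whole file in the context of the target database
}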

Now, if you want to use SMO and you are deploying using Octopus Deploy, then you’re in (some) luck: there are scripts available in the Octopus Library that do something like this, but you will have to alter them to execute a folder full of SQL scripts. Fortunately, I have already done this, and I will add my script to the GitHub solution so that it becomes available on the Library. When it is available, assuming it meets the quality control, I’ll blog about how to use it.

edit: it is now available on the Octopus Library. There is also a part two on how to use the step template.

Using TeamCity To Raise Octopus Defects Updated

Hello!

I recently updated the script that uses both the TeamCity and Octopus APIs to raise defects if a TeamCity build has failed (the reasoning here is that a deployment has passed in Octopus, but subsequent automated testing has failed).

This script uses functions and is a little neater than the previous version. You will have to create a snapshot dependency between the build that this step runs in and the build that deploys/runs the tests. You will also need to add parameters to the TeamCity build that runs this script, so that the script parameters below can be populated.

Essentially the “UpdateDefect” function is called, and if it finds that the build has failed then it will call “RaiseDefect”.

That’s all for today, have a good one. Happy scripting!


param(
[parameter(Mandatory=$true)] [string]$buildType,
[parameter(Mandatory=$true)] [string]$dependentBuildNumber,
[parameter(Mandatory=$true)] [string]$epicFail,
[parameter(Mandatory=$true)] [string]$root,
[parameter(Mandatory=$true)] [string]$user,
[parameter(Mandatory=$true)] [string]$pass,
[parameter(Mandatory=$true)] [string]$api,
[parameter(Mandatory=$true)] [string]$OctopusApiKey,
[parameter(Mandatory=$true)] [string]$ProjectName,
[parameter(Mandatory=$true)] [string]$ReleaseVersion
)

Function UpdateDefect
{
$build = "$buildType #$dependentBuildNumber"
$secpasswd = ConvertTo-SecureString $pass -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($user, $secpasswd)
$result = Invoke-WebRequest $root -Credential $credential -UseBasicParsing
[xml]$xml = $result.Content
[string]$status= $xml.builds.build | select -first 1 | Format-Table -AutoSize -Property status -HideTableHeaders | Out-String
$status = $status.Trim()
if ($status -eq $epicFail)
{
write-host "Build $build Failed. Raising a defect."
RaiseDefect
}
elseif ($status -ne $epicFail) {write-host "Build $build has not failed. Not raising a defect."}
}
$buildType #echo the build type to the build log
Function RaiseDefect
{
[string]$defectMessage = "This is a defect raised by failing TeamCity build $build. Please refer to the logs of the build for the reason of the failure." #the defect description is a plain string
$header = @{ "X-Octopus-ApiKey" = $OctopusAPIKey } #header for web requests
$defectDescription = @{ description = $defectMessage } | ConvertTo-Json #description of defect is required when posting new defect

#retrieve project id; required for getting release
$projectUri = "$api/api/projects/$ProjectName"
$Project = Invoke-WebRequest -Uri $projectUri -Headers $header -UseBasicParsing | ConvertFrom-Json
$ProjectId = $Project.Id

$relUri = "$api/api/projects/$ProjectId/releases/$ReleaseVersion"
$releaseDetails = Invoke-WebRequest -Uri $relUri -Headers $header -UseBasicParsing | ConvertFrom-Json

$reportDefect = $releaseDetails.Links.ReportDefect #url to report defects
$defectApiUrl = $releaseDetails.Links.Defects #url to get info on defects
$resolveDefect = $releaseDetails.Links.ResolveDefect #url to resolve defects

$getDefectUrl = "$api$defectApiUrl" #make full defect url
$reportDefectUrl = "$api$reportDefect" #make full report url
$resolveDefectUrl = "$api$resolveDefect" #make full resolve url

#check if unresolved defect exists. If it does, then write-error and exit script
$StatusesofDefects= Invoke-WebRequest -Uri $getDefectUrl -Method Get -Headers $Header -UseBasicParsing | ConvertFrom-Json
$StatusesofDefects.Items.Status #write the existing defect statuses to the log
foreach ($DefectStatus in $StatusesofDefects.Items.Status)
{
if ($DefectStatus -eq 'Unresolved')
{
Write-Error "An unresolved Octopus defect already exists. Resolve previous Octopus defects before attempting to raise another defect."
exit
}
}
#we'll only get here if there are no unresolved defects
write-host "raising defect for project" $ProjectName "for release version" $ReleaseVersion
Invoke-WebRequest -Uri $reportDefectUrl -Method Post -Headers $Header -Body $defectDescription -UseBasicParsing | ConvertFrom-Json
#commented out resolve defect. We don't need it to run, but it's here for future reference.
#Invoke-WebRequest -Uri $resolveDefectUrl -Method Post -Headers $header | ConvertFrom-Json
}

UpdateDefect
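
For reference, if the script above is saved as something like RaiseOctopusDefect.ps1, then the call from the TeamCity PowerShell build step might look like the following. Every value here is made up, and the REST url in particular will depend on how your build configurations are named:

#all of these values are hypothetical
.\RaiseOctopusDefect.ps1 `
    -buildType "MyProject_DeployAndTest" `
    -dependentBuildNumber "%dep.MyProject_DeployAndTest.build.number%" `
    -epicFail "FAILURE" `
    -root "https://teamcity.example.com/httpAuth/app/rest/buildTypes/id:MyProject_DeployAndTest/builds" `
    -user "teamcity.user" `
    -pass "%TeamCityPassword%" `
    -api "https://octopus.example.com" `
    -OctopusApiKey "%OctopusApiKey%" `
    -ProjectName "MyProject" `
    -ReleaseVersion "%dep.MyProject_DeployAndTest.build.number%"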

Using TeamCity To Raise Octopus Defects

In my previous post, I shared a script that used the Octopus API to create a defect. The idea is to add it to a TeamCity build and chain that build to a deploy/test build step in TeamCity, so that an Octopus defect is raised if a test fails whilst the deployment to the environment succeeded. You can read more about it here.

What makes this a challenge is that there is no way to have a chained build that runs if, and only if, a build has failed. So, as with Octopus, you have to use the TeamCity API. In this script I get the status of the last build that deployed/ran the tests, and if that build succeeded I do nothing. So yes, this chained build always has to run after the deploy/test phase.

Where it gets interesting, though, is if the build failed. Here we raise a defect, but not before checking to see whether there are any defects raised that are still unresolved, as only one Octopus defect can be unresolved at any one time.

Creating and Resolving Defects Using Octopus API

Hello!

I’ve been using Octopus and TeamCity together as a build/deploy solution, and the more I use them together the more I like them. I have a lot of experience in using the TFS Build process, and used MSBuild to create semi-automated deployments. The web services etc. were packaged up as MSIs using WiX. I’ve posted about WiX before, and have not attempted to hide my contempt for it. Fortunately NuGet has come along and has all but removed the need for WiX. Luckily for me, I am firmly ensconced in working with database deployments, and so NuGet packaging is a straightforward process.

Connect VSTS Package Management to Octopus

As part of the Visual Studio Team Services offering, Microsoft have added a Package Management module. Though still only in preview mode, you are able to set up feeds and then publish NuGet packages either through NuGet command line or as a step in a Visual Studio build. I’m going to assume you’ve read the documentation and set up a feed, and that as part of your build you wish to publish a NuGet package to your new feed, and then you want to add the feed as a service on Octopus. I’m also going to assume you are using the NuGet package step instead of OctoPack to make your NuGet packages. Continue reading “Connect VSTS Package Management to Octopus”