Database Deployments

Pat yourself on the back. You now have an MSBuild-based web deploy package ready. In this video, we’re going to take this a step further and talk about creating database packages for database deployments. Now, you can’t really have a DevOps process where you have siloed teams taking care of web applications while a separate team takes care of database applications. Unless you merge the two together, you’re always going to have situations where the web components are ready and packaged for deployment, but the DBAs do their part manually: you raise a service ticket to engage them, and then they log onto the server and run a script. That can’t scale.
You can’t scale that operation. So unless you bring your database into the same DevOps process that your development teams are using, you’re not going to be able to fully automate a pipeline. Now, Visual Studio has always supported database projects natively within the IDE. In this demo, we’ll look at how you can add a new database project by simply importing an existing database schema that you may already have, how easy it is to create scripts and have those scripts run as post-deployment steps, and then how easy it is to plug that into a build pipeline to create what’s called a DACPAC file. In this video, we’ll only cover creating the DACPAC file.
In a future video, we’ll talk about how we can take that DACPAC file and deploy it into a live database. And if you’re doing greenfield development, that process only gets simpler. All right. In this video, we’re going to focus exclusively on SQL packages. Let’s get started with the demo. OK, so I’m going to come into Visual Studio, and to the same Summit solution that we’ve been working on from the previous demo, I’ve simply added a new database project. Let me walk you through what I have done here. If I right-click the solution, choose Add, and choose New Project, I have SQL Server as a project type.
Now, you could go into SQL Server, select this project type, and give it a name. After you’ve done that, you would have a SQL Server project in your solution. If you already have a schema that you want to bring into version control, then hover over Import, and you can import an existing data-tier application, database, or script. In my case, I’ve created this very simple schema from scratch, and I was able to do it because of the SQL Server tools that are available natively within Visual Studio. So let me give you a quick view of what I’ve created here. I’ve created a campaign table.
This campaign table records the campaigns that the marketing team is running. And then, on the back of that, I’ve got a campaign details table, which records the start and end dates, the percentage discount, and whether the campaign is active or not. A very simple schema. But I know for a fact that when I roll this out in a test environment, the first thing I’m going to be asked is, can we have all our campaigns pre-loaded in the database? And you could go ahead and script that. If you know SQL, it’s fairly easy to create a script.
But rather than creating bare insert statement scripts, please pad your scripts with a sanity check that tests whether the data is already there. The reason for doing that is that your script then becomes idempotent: you could run the same script again and again against the same database, and if the record already exists, it’s not going to insert it again. By doing so, you enable your script to be run from an automated process. If the script is pretty dumb and just blindly inserts data without checking whether the data already exists, then people are going to say, oh, it’s a one-off script.
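As a sketch, such a guarded insert might look like the following; the table and column names here are assumed for illustration, not taken from the demo schema:

IF NOT EXISTS (SELECT 1 FROM dbo.Campaign WHERE Name = 'April 2017 Spring Sale')
BEGIN
    -- Row is missing, so it is safe to insert it.
    INSERT INTO dbo.Campaign (Name, CreatedOn)
    VALUES ('April 2017 Spring Sale', GETUTCDATE());
END

Because the INSERT only fires when the row is missing, the script is idempotent: running it a second time against the same database is a harmless no-op.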
Let’s just run it manually, because if we run it multiple times, it’s just going to insert duplicate data. Now, once we have the script in place and it’s padded to check whether the data exists or not, you could simply go ahead and add a post-deployment script. The difference between a normal script and a post-deployment script is in the file’s properties. If I look at the properties of this file, you would see that the build action is set to None, which means that while this script will be copied into the destination folder, it’s not going to be included in the compile process, whereas the post-deployment script has a build action.
We’re telling the build that this is a post-deployment script: after the compile process finishes, pick this script up and execute it. Now, within the script, I’m using SQLCMD mode. And in SQLCMD mode, I’m saying that as part of this script, I want you to go and pick up this other script that I’ve created, which has the data I need inserted. Rather than having one mega post-deployment script, what I’ve done here is break it into individual subscripts that I can control through the post-deployment script. For example, I’m saying campaign for April 2017.
If I were to create campaigns for May 2017, I could have a separate SQL script which gets called from the post-deployment script. So it all ties in nicely together. If I close this, now that we’ve seen what the solution is, we’re going to go back into the build definition. This is the same build definition from the previous video; you can see I haven’t made any changes here.
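In SQLCMD mode, the post-deployment script pulls in those subscripts with the :r include directive. A minimal sketch, with assumed file names:

/*
Post-Deployment Script
Executed once after the schema changes have been applied.
*/
:r .\Scripts\Campaigns-April2017.sql
:r .\Scripts\Campaigns-May2017.sql

Adding another month’s campaign data is then just a matter of adding one more :r line pointing at a new script.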
And because of the tight integration between Visual Studio database projects and MSBuild, without making any further changes, running the same build definition will detect that additional outputs were generated in the bin folder for the configuration you’re building, and it will include those when it publishes the artefact. As you can see in the build step, it’s simply building what we tell it to build. We aren’t saying go and build the DB project specifically; it picks that up from the solution configuration. If we look at Configuration Manager, you can see that we’re telling MSBuild to compile this project as part of the build.
It just picks that up and compiles it. We don’t make any further changes. Then, as part of the publish process, it looks in the bin folder of the build configuration, which is Release in this case, finds whatever is there, and publishes it as an artefact. So let’s run this and see what the output looks like.
Great. As you can see, the build succeeded. If we go back into the build definition, look at artefacts, and click Explore, you would see that in addition to the test and web deploy packages, we have a new folder called Summit DB. And if we navigate into the bin folder and then Release, you can see that it has not only taken the DLL compiled from the DB project; it has also created a DACPAC file, along with the SQL script used in the post-deployment step. Now, you don’t see the post-deployment script itself separately, because it’s included within the DACPAC file, but the DACPAC has a reference to this file.
Because all of this is in one folder, when we trigger the deployment, it will do a full-fidelity schema compare of the target database against the DACPAC contents, generate an alter script (a deploy script) on the fly, and then do the deployment. But we’ll see this in action in a future video when we go on to deploy our database. This video was focused on creating an output by importing an existing database into a database project.
So in this video, we saw how easy it is to bring your database into a version control system, how easy it is to create post-deployment scripts, and how padding those scripts with checks makes them idempotent so that they can be run multiple times. Then, using the same build definition and leveraging the integration that Visual Studio has with all of this tooling, we were able to create a DACPAC file without taking any further steps. Now that we have a build process that creates a web package and a database package, I think we’re ready to explore how we can start releasing this cool stuff that we’ve created using our build pipelines.
So join me in the next video where we’ll start to look into that.

Well done! In the previous step, you created an MSBuild-based web deploy package. We are going to take what we have done a step further and talk about creating database packages for database deployments, bringing your databases into the same DevOps process.

The aim of creating database packages for database deployments is to bridge the gap that often exists between developers and database administrators. It is not uncommon to have siloed teams taking care of web applications and database applications separately.

Unless these teams are merged, you will always have a slowdown in the deployment process: web components are ready and packages can be deployed, but DBAs manually run scripts on receipt of service tickets.

Visual Studio natively supports database projects, which can be created by simply importing an existing database schema. When you deploy databases in a release pipeline, you face a unique set of issues that you can mitigate by choosing among the various deployment options.

Prominent among the deployment options are DACPAC and Script Deployments.

DACPAC

A data-tier application (DAC) is a self-contained unit of Microsoft SQL Server database deployment that enables developers and administrators to package SQL Server objects into a portable artefact called a DAC package, also known as a DACPAC.

The advantage of a DACPAC is that the deployment tool assists in identifying and validating behaviour across different target and source databases. It also warns if upgrades might cause data loss and provides an upgrade plan.

For more information on DACPAC, see https://docs.microsoft.com/en-us/sql/relational-databases/data-tier-applications/data-tier-applications

For more on DACPAC deployment, see https://docs.microsoft.com/en-us/sql/relational-databases/data-tier-applications/deploy-a-data-tier-application

Script Deployments

A script deployment is a SQL Server script with data definition language (DDL), stored procedure, function, and data scripts (pre- and post-scripts), which can be generated from a database build or schema compare, or written manually.

Scripts provide the greatest flexibility for complex or non-standard behaviour, but they forgo any warning mechanisms, making deployment riskier. Additionally, rollback behaviour might need to be customised as a separate script.

The risks in this approach, especially when scripts are assembled manually, range from determining which scripts should be run and in what order, to determining whether the scripts are safe to run multiple times or require a rollback before rerunning.

For more on SQL script deployment, see https://www.visualstudio.com/en-us/docs/build/apps/aspnet/cd/deploy-database-sqlscripts

DACPAC Deployment in Visual Studio Team Services

The process involves a few basic steps:

  1. Add a build step to publish the DACPAC.

  2. Add a release task to copy the DACPAC to the target database VM, and then add a variable for the admin login and password.

  3. Add a release SQL Server DACPAC task.
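A DACPAC can also be published by hand with the SqlPackage.exe utility, which is what the DACPAC deployment tooling drives under the covers. The following is a rough sketch; the source file path is an assumption for illustration, and the server, database, and credential values are placeholders:

SqlPackage.exe /Action:Publish `
    /SourceFile:"C:\Drop\SummitDB.dacpac" `
    /TargetServerName:"{database-server-name}" `
    /TargetDatabaseName:"{database-name}" `
    /TargetUser:"{username}" `
    /TargetPassword:"{password}"

Publish performs the schema compare against the target and applies the generated upgrade script in one step; swapping in /Action:Script (with /OutputPath) instead writes the upgrade script out for review without applying it.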

Releasing a DACPAC using Windows PowerShell Script

Besides Visual Studio Team Services, you can use a Windows PowerShell script to release a DACPAC, as in this example:

  1. Set an SMO Server object to the default instance on the local computer.
CD SQLSERVER:\SQL\localhost\DEFAULT
$srv = get-item .

2. Open a Common.ServerConnection to the same instance.

$serverconnection = New-Object Microsoft.SqlServer.Management.Common.ServerConnection($srv.ConnectionContext.SqlConnectionObject) 
$serverconnection.Connect()
$dacstore = New-Object Microsoft.SqlServer.Management.Dac.DacStore($serverconnection)

3. Load the DAC package file.

$dacpacPath = "C:\MyDACs\MyApplication.dacpac"
$fileStream = [System.IO.File]::Open($dacpacPath,[System.IO.FileMode]::OpenOrCreate)
$dacType = [Microsoft.SqlServer.Management.Dac.DacType]::Load($fileStream)

4. Subscribe to the DAC deployment events.

$dacstore.add_DacActionStarted({Write-Host "`n`nStarting at $(get-date) :: $($_.Description)"})
$dacstore.add_DacActionFinished({Write-Host "Completed at $(get-date) :: $($_.Description)"})

5. Deploy the DAC and create the database.

$dacName = "MyApplication" 
$evaluateTSPolicy = $true
$deployProperties = New-Object Microsoft.SqlServer.Management.Dac.DatabaseDeploymentProperties($serverconnection,$dacName)
$dacstore.Install($dacType, $deployProperties, $evaluateTSPolicy)
$fileStream.Close()

Firewall Connections and SQLCMD Utility

Script deployment in VSTS involves using PowerShell tasks to enable firewall connections and drive the SQLCMD utility, which deploys the database:

  1. Add a release PowerShell task to add firewall rule to allow connection to the Azure SQL Database.

  2. Add a release PowerShell task such as:

    • SQLCMD

    • Arguments: -S {database-server-name}.database.windows.net -U {username}@{database-server-name} -P {password} -d {database-name} -i {SQL file}.

    Note: {SQL file} refers to an artifact source created as a build step; for example $(System.DefaultWorkingDirectory)/devops200.3demo-repo/demonstrationdatabase.sql

  3. Add a release PowerShell task to remove the firewall rule created in the first step. Removing the firewall rule restores the desired state of security for the VM.
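Putting the arguments from step 2 together, the PowerShell task would invoke SQLCMD along these lines; the server, database, and credential values are placeholders, and $(System.DefaultWorkingDirectory) is a VSTS variable expanded at release time:

SQLCMD -S {database-server-name}.database.windows.net `
    -U {username}@{database-server-name} -P {password} `
    -d {database-name} `
    -i "$(System.DefaultWorkingDirectory)/devops200.3demo-repo/demonstrationdatabase.sql"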

The following are examples of scripts used to add and remove firewall rules:

Add Firewall Rule

[CmdletBinding(DefaultParameterSetName = 'None')] 
param
(
[String][Parameter(Mandatory = $true)] $ServerName,
[String] $AzureFirewallName = "AzureWebAppFirewall"
)

$ErrorActionPreference = 'Stop'

function New-AzureSQLServerFirewallRule {
$agentIP = (New-Object net.webclient).downloadstring("http://checkip.dyndns.com") -replace "[^\d.]"
New-AzureSqlDatabaseServerFirewallRule -StartIPAddress $agentIp -EndIPAddress $agentIp -RuleName $AzureFirewallName -ServerName $ServerName
}
function Update-AzureSQLServerFirewallRule{
$agentIP = (New-Object net.webclient).downloadstring("http://checkip.dyndns.com") -replace "[^\d.]"
Set-AzureSqlDatabaseServerFirewallRule -StartIPAddress $agentIp -EndIPAddress $agentIp -RuleName $AzureFirewallName -ServerName $ServerName
}

If ((Get-AzureSqlDatabaseServerFirewallRule -ServerName $ServerName -RuleName $AzureFirewallName -ErrorAction SilentlyContinue) -eq $null)
{
New-AzureSQLServerFirewallRule
}
else
{
Update-AzureSQLServerFirewallRule
}

Remove Firewall Rule

[CmdletBinding(DefaultParameterSetName = 'None')] 
param
(
[String][Parameter(Mandatory = $true)] $ServerName,
[String] $AzureFirewallName = "AzureWebAppFirewall"
)

$ErrorActionPreference = 'Stop'

If ((Get-AzureSqlDatabaseServerFirewallRule -ServerName $ServerName -RuleName $AzureFirewallName -ErrorAction SilentlyContinue))
{
Remove-AzureSqlDatabaseServerFirewallRule -RuleName $AzureFirewallName -ServerName $ServerName
}
This article is from the free online

Microsoft Future Ready: Continuous Integration Implementation

Created by
FutureLearn - Learning For Life
