Tech blog of Jon Ryan, Web Developer & Geek

Addressing pain points of CI/CD for Databases - Part 2

In Part 1 I covered a number of problems I had getting the build pipeline set up to detect drift between what EF Core expects and a source-controlled database project.

I am now going to go through the pain points I had when it came to the release pipeline and getting the database changes out into production in Azure.

What I began with

At the end of part 1, I had a .dacpac representing the database being published as an artifact for use by the release pipeline.

Problem 1: Will the .dacpac deploy to the live database OK?

I wanted to use the “Azure SQL Database deployment” task to deploy the .dacpac, but felt uneasy about sending it straight to the live database in case it did some sort of partial update and failed. I also liked the JUnit test results from Part 1 and wanted to do something similar in the release pipeline to flag any issues before trying to release to a live database.


I decided the best thing to do would be to create a copy of the live database using the “Azure CLI” task. I then went through the same steps as the build pipeline:

  • Run an “ubuntu-latest” agent
  • Install SqlPackage
  • Create a deployment report
  • Convert the deployment report to JUnit format using an XSLT file

This time around I didn’t want the JUnit report to fail on any changes (that would make no sense; changes are kind of the point). Instead, I just wanted it to flag the alerts for potential data loss. So the XSLT file this time around looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
	xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
	xmlns:dac="http://schemas.microsoft.com/sqlserver/dac/DeployReport/2012/02"
	exclude-result-prefixes="dac">

	<!-- Define the output format -->
	<xsl:output method="xml" indent="yes" encoding="UTF-8"/>

	<!-- Match the root element of the source XML -->
	<xsl:template match="/">
		<testsuites time="0">
			<testsuite name="DeploymentReport" time="0">
				<xsl:apply-templates select="//dac:Alert[@Name='DataIssue']"/>
			</testsuite>
		</testsuites>
	</xsl:template>

	<!-- Turn each data-loss issue into a failing test case -->
	<xsl:template match="dac:Alert">
		<xsl:for-each select="dac:Issue">
			<testcase>
				<xsl:attribute name="name">
					<xsl:value-of select="../@Name"/>
					<xsl:number value="position()" format="1"/>
				</xsl:attribute>
				<xsl:attribute name="classname">
					<xsl:value-of select="../@Name"/>
					<xsl:number value="position()" format="1"/>
				</xsl:attribute>
				<xsl:attribute name="time">0</xsl:attribute>
				<failure>
					<xsl:attribute name="type">
						<xsl:value-of select="../@Name"/>
					</xsl:attribute>
					<xsl:attribute name="message">
						<xsl:value-of select="@Value"/>
					</xsl:attribute>
				</failure>
			</testcase>
		</xsl:for-each>
	</xsl:template>

</xsl:stylesheet>
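For clarity, the selection logic of that transform (only alerts named DataIssue, one failing test case per issue) can be sketched in Python. The sample report below is made up, and the DeployReport namespace URI is my assumption about what SqlPackage emits:

```python
import xml.etree.ElementTree as ET

# Assumed DeployReport namespace; the sample report itself is fabricated.
NS = {"dac": "http://schemas.microsoft.com/sqlserver/dac/DeployReport/2012/02"}

report = """<DeploymentReport xmlns="http://schemas.microsoft.com/sqlserver/dac/DeployReport/2012/02">
  <Alerts>
    <Alert Name="DataIssue">
      <Issue Value="Rows in [dbo].[Orders] may be lost" />
    </Alert>
    <Alert Name="OtherAlert">
      <Issue Value="Not a data issue; ignored" />
    </Alert>
  </Alerts>
</DeploymentReport>"""

root = ET.fromstring(report)

# Mirror the XSLT's select="//dac:Alert[@Name='DataIssue']" plus the
# per-issue loop: each matching Issue becomes one failing JUnit test case.
failures = [issue.get("Value")
            for alert in root.findall(".//dac:Alert[@Name='DataIssue']", NS)
            for issue in alert.findall("dac:Issue", NS)]
print(failures)
```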

Problem 2: How to fix issues when detected

The JUnit test results instantly flagged an issue due to a table being significantly changed. I needed to create some script files to perform the manual data migration and table modifications. At first I looked at the pre-deployment script support built into the SQL database project, but the deploy report generated by SqlPackage doesn’t take these scripts into account (not helpful).


As well as deploying a .dacpac, the “Azure SQL Database deployment” task can also deploy a SQL script file. So I had the Pre-Deployment scripts copied as build artifacts, for the release pipeline to run against the copied database before generating the deployment report.

Problem 3: There can be only one…

…SQL file. The “Azure SQL Database deployment” task is only capable of deploying a single SQL script file. So I needed to merge the scripts from the Pre-Deployment folder into a single SQL script file.

I tried using cat:

cat *.sql > combined.sql

But I needed a line break between the joined scripts. So next I tried:

for f in *.sql; do (cat "${f}"; echo) >> combined.sql; done

That resulted in some kind of encoding issue though…
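I can’t say for certain this was the cause, but one common culprit when concatenating SQL files on Windows is a UTF-8 byte-order mark at the start of each file ending up mid-stream in the merged output. A sketch of a shell-side fix, assuming UTF-8 files and GNU sed (the file names are made up):

```shell
# Hypothetical repro: two UTF-8 SQL files saved with a leading BOM.
printf '\xef\xbb\xbfSELECT 1;\n' > a.sql
printf '\xef\xbb\xbfSELECT 2;\n' > b.sql

# Strip the BOM from the first line of each file while merging,
# keeping a blank line between scripts.
# Note: \xHH escapes in the pattern are a GNU sed extension.
for f in a.sql b.sql; do sed '1s/^\xef\xbb\xbf//' "$f"; echo; done > combined.sql
```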


In the end, I used a Powershell script to merge them:

- task: PowerShell@2
  displayName: Merge SQL Pre-Deployment Files into one
  inputs:
    targetType: 'inline'
    script: |
      $rootFolder = "$(System.DefaultWorkingDirectory)/Database/Pre-Deployment"
      $outFile = "$(System.DefaultWorkingDirectory)/Database/Pre-Deployment/combined.sql"
      $sw = New-Object System.IO.StreamWriter $outFile, $true  # $true is for Append
      Get-ChildItem -Path $rootFolder -Filter '*.sql' -File |
          Where-Object { $_.Name -ne 'combined.sql' } |  # don't merge the output into itself
          ForEach-Object {
              Get-Content -Path $_.FullName -Encoding UTF8 | ForEach-Object {
                  $sw.WriteLine($_)
              }
          }
      $sw.Close()

Final Result

So in the “DB tests” phase, I ended up with:

Windows Agent

  • Azure CLI making a copy of the live DB
  • Azure SQL Database deployment task deploying the combined pre-deployment scripts

Ubuntu Agent

  • Install SqlPackage
  • Create Deploy Report
  • Convert report to JUnit format (catching data loss alerts) using XSLT
  • Publish JUnit test results

Windows Agent

  • Azure SQL Database deployment task deploying the .dacpac
  • Azure CLI dropping the copy of the live database
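In YAML terms the phase ends up as a chain of jobs hopping between pools, roughly like this. The job names and dependency chain here are illustrative, not copied from my actual pipeline:

```yaml
# Rough shape only: the pool hopping is the point, the details are illustrative.
- stage: DBTests
  jobs:
  - job: PrepareCopy
    pool: { vmImage: 'windows-latest' }
    steps: []  # Azure CLI copy of live DB + deploy pre-deployment scripts
  - job: ReportAndCheck
    dependsOn: PrepareCopy
    pool: { vmImage: 'ubuntu-latest' }
    steps: []  # install SqlPackage, deploy report, XSLT convert, publish JUnit
  - job: DeployAndCleanUp
    dependsOn: ReportAndCheck
    pool: { vmImage: 'windows-latest' }
    steps: []  # deploy .dacpac to the copy, then drop the copy
```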

Then the live database deployment is simply:

Windows Agent

  • Azure SQL Database deployment task deploying the combined pre-deployment scripts
  • Azure SQL Database deployment task deploying the .dacpac

Final Thoughts

Deploying a database with confidence, for me, required multiple checks:

  • Check that the source-controlled database schema is in sync with the EF Core configuration
  • Check that deploying to the live database will not cause data loss
  • Do a dry run against a copy of the live database just to make sure

Those checks mean I can now check code in with confidence, knowing that the release will only be stopped if there is an issue; otherwise everything deploys without any intervention on my part. Happy days.

If any of this has helped you, then my pain and frustration were worth it.

Written on June 21, 2024