Continuous Delivery series part one

Continuous Delivery: who hasn't heard of it in recent years? Dave Farley, one of our keynote speakers during last year's Engineer's Week, wrote an excellent book on this topic back in 2010 together with Jez Humble. In it, he discusses a wide variety of techniques and principles for delivering your software in the most reliable way possible. Although the book was released a while ago, the techniques and principles are still valuable for every modern software development project. Therefore, in this series, you will learn how to leverage these techniques and principles on SQL Server database projects.

Most database developers will be familiar with SQL Server Data Tools (SSDT for short) for doing their database development through Visual Studio. In recent years, Microsoft has worked to bring the same developer experience to Visual Studio Code (and Azure Data Studio) through an extension called mssql for Visual Studio Code. This extension, which is still in preview, can create database projects based on the new SDK-style approach. This has several advantages for developers who ship databases through CI/CD pipelines or work in a cross-platform environment.

Without further ado, let’s look at some of the pre-requisites, as this blog will be a hands-on one!

Pre-requisites

This blog series will be hands-on. If you want to try it out yourself, you’re going to need:

  • Visual Studio Code installed – v1.77.3 is used in this blog
  • mssql for Visual Studio Code extension installed – v1.17.1 is used in this blog
  • Microsoft SQL Server Database Projects for Azure Data Studio and VS Code extension installed – v0.21.0 is used in this blog
  • Windows PowerShell v5.1

If you already have all pre-requisites in place, let's set up our local environment first and talk about the first principle, before taking a glance at how .NET SDK database projects differ from the standard database projects created by SSDT.

Create the possibility to version the tools you need

For each database project you create, you cannot escape depending on tools that are required to build and deploy your database. But with every tool you introduce comes the management of it. For larger infrastructures, you might look at tools that do this in a declarative way, for example:

  • Chef
  • Puppet
  • Ansible
  • System Center Configuration Manager
  • Desired State Configuration
  • Terraform

What is commonly overlooked are the workstations developers work on. While you can definitely manage these workstations through the tools mentioned above, it isn't always possible to do so, especially in a Bring Your Own Device environment. Wouldn't it be great if you could help developers out by bootstrapping their environment from a simple script and pinning the versions of the tools they need to build and possibly deploy the database? You might already be familiar with this concept if you've seen the 5 best practices for using Azure Bicep. To quickly recap: bootstrapping is a technique to build up your development environment from scratch. Now you might argue that this could already have been done for the pre-requisites, but hey. You had to start somewhere, right?

Since you'll be targeting the .NET SDK project format, you'll also need the .NET SDK. Duh! Let's set it up. This blog uses C:\Repos\PoweringUpDatabase, where the PoweringUpDatabase folder is the root.

  1. Create a RequiredModules.psd1 file in the root; this is the PowerShell data file that defines the dependencies you need for the project
  2. In the file, add the following content to specify the .NET SDK with a pinned version, and save it
@{ 
  
    'release' = @{ 
        DependencyType = 'DotnetSdk' 
        Version        = '6.0.408' 
        Target         = './.dotnet/' 
    } 
} 

3. Grab both Resolve-Dependency.psd1 and Resolve-Dependency.ps1 from the Sampler GitHub page; they initialize the required package providers and resolve the dependencies through Invoke-PSDepend

PSDepend is a simple PowerShell dependency handler: you specify the dependencies you need in the data file, and it retrieves them for you.
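Under the hood, the heavy lifting boils down to a single PSDepend call. A minimal sketch of that core (assuming only the PowerShell Gallery is reachable):

# Install the PSDepend handler, then resolve everything declared in
# the RequiredModules.psd1 data file.
Install-Module -Name PSDepend -Scope CurrentUser -Force
Invoke-PSDepend -Path .\RequiredModules.psd1 -Install -Force

Let's just run the script and see what happens.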

4. In your Windows PowerShell terminal, run .\Resolve-Dependency.ps1 

Figure 1: .NET SDK installation from Resolve-Dependency.ps1 script

Great. You now have the .NET SDK in your arsenal! Let’s compare the new structure against the standard SQL database project format.

Microsoft.Build.Sql

Microsoft.Build.Sql takes a different approach to structure compared to the standard SQL database project file format. It introduces wildcard selection by default: any .sql file underneath the SQL project's path is picked up without additional steps. Let's see the differences between the two.

Figure 2: Standard database project when you create through Visual Studio
Figure 3: .NET SDK style database project created through Visual Studio Code
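To make the difference concrete: an SDK-style project file is little more than a reference to the Microsoft.Build.Sql SDK, since every .sql file underneath the project path is included by wildcard. A minimal sketch of what the extension generates (the SDK version shown is an assumption; use whatever your extension pins):

<Project DefaultTargets="Build">
    <Sdk Name="Microsoft.Build.Sql" Version="0.1.9-preview" />
    <PropertyGroup>
        <Name>PoweringUpDatabase</Name>
        <DSP>Microsoft.Data.Tools.Schema.Sql.Sql150DatabaseSchemaProvider</DSP>
    </PropertyGroup>
</Project>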

It is still possible to use the existing capabilities of SQL database projects, such as SQLCMD variables, pre- and post-deployment scripts, and even database references. The format of these items remains unchanged. As already mentioned, since it is cross-platform, you're able to use the dotnet build command line, meaning you can also build it from a Linux-based pipeline. Alright, enough about Microsoft.Build.Sql. Let's look at the second principle and create the .NET SDK style database project.

Have your project folder structure standardized

Structuring your project folder is one of the first steps toward starting any software project. Anyone who looks at well-maintained repositories on GitHub knows this. But why does the project folder structure matter?

  1. It helps new developers understand how the project is organized
  2. It helps separate your application code from deployment and other files
  3. It helps in determining absolute vs. relative paths, which in turn helps when creating automation scripts around your folder structure
  4. It helps define categories for different objects (stored procedures, schemas, tables)

Now that you know why it is important to have your project folder standardized, start by adding the .NET SDK database project.

  1. Open Visual Studio Code
  2. Create a new folder called src in the root of the project
  3. Click Database Projects icon from the extensions to create a .NET SDK style database project

    Figure 4: Create .NET SDK style database project through extension
  4. Click Create new
  5. Select SQL Server Database
  6. Fill in the Project Name, in this blog PoweringUpDatabase is used
  7. Select Browse and select the folder src that you created earlier
  8. Select version 2019 as SQL Server version
  9. Select Yes when asked to create an SDK-style database project (Preview)

Make sure that you select Yes in the last step, as this is what creates the .NET SDK style database project. If everything is successful, you should have the following folder structure created.

Figure 5: Creating the database solution structure

Later in this blog, you'll be adding an additional project, so let's create the solution file in the src folder right away and add the database project to it.

  1. In your PowerShell terminal, type the following: dotnet new sln -n PoweringUpDatabase

    Figure 6: Add solution file with dotnet command-line
  2. Add the database project by typing dotnet sln add PoweringUpDatabase
Figure 7: Add the database project to solution file
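Both steps combined look like this (a sketch, starting from the repository root; passing the explicit .sqlproj path to dotnet sln add is the most reliable form):

# Create the solution file, then register the database project in it.
Set-Location .\src
dotnet new sln -n PoweringUpDatabase
dotnet sln add .\PoweringUpDatabase\PoweringUpDatabase.sqlproj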

Cool! You’ve now created the basic folder structure. Try running dotnet build from the command line to see if the project builds.
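From the repository root, that can look like this (a sketch using the .NET SDK pinned into ./.dotnet earlier; a machine-wide dotnet on your PATH works just as well):

# Build the project with the bootstrapped SDK; the Data-Tier Application
# ends up in src\PoweringUpDatabase\bin\Debug\PoweringUpDatabase.dacpac.
.\.dotnet\dotnet.exe build .\src\PoweringUpDatabase\PoweringUpDatabase.sqlproj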

Figure 8: Running dotnet build

Now before moving on to the next section, you want something to work with when you deploy the database project. Make sure you populate the following folder structure.

Figure 9: Create the database objects

Inside the SuperHeroes.sql, add the following content:

CREATE TABLE [dbo].[SuperHeroes] 
( 
    [Id] INT NOT NULL PRIMARY KEY, 
    [Name] NVARCHAR(128) NOT NULL, 
    [Rating] SMALLINT NULL 
) 

For the stored procedure, add the following Add_SuperHero procedure, which simply returns its inputs (you'll write a unit test for it later):

CREATE PROCEDURE [dbo].[Add_SuperHero] 
    @SuperHero nvarchar(128), 
    @CoolnessRating smallint 
AS 
SELECT @SuperHero, @CoolnessRating 
RETURN 0 

When you use the standard database project format, you know that a file reference is added to the .sqlproj file. In the new .NET SDK style, you will not see one, since it uses wildcard selection. Does this mean the objects are still included? Well, check it out and run the dotnet build command line once more to see if it builds. It is still a bit difficult to tell whether these objects ended up in your Data-Tier Application, and for that, you need to publish it.
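If you can't wait for the publish step, there is also a quick-and-dirty way to peek inside the .dacpac: it is simply a ZIP archive. A hedged sketch (Expand-Archive insists on a .zip extension, hence the copy):

# Copy the .dacpac to a .zip, extract it, and grep the model for our table.
Copy-Item .\src\PoweringUpDatabase\bin\Debug\PoweringUpDatabase.dacpac .\PoweringUpDatabase.zip
Expand-Archive -Path .\PoweringUpDatabase.zip -DestinationPath .\dacpac-peek -Force
Select-String -Path .\dacpac-peek\model.xml -Pattern 'SuperHeroes'

If SuperHeroes shows up in model.xml, the wildcard selection did its job. Still, the real proof is a publish, so let's talk about how you can publish your Data-Tier Application locally.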

Deploy locally before committing your code

One of the benefits of deploying locally is the ability to test the changes you make in your .sql scripts in isolation from others. To publish your Data-Tier Application, you need a command-line utility that helps you do so. This utility is called SqlPackage. SqlPackage exposes some of the public Data-Tier Application Framework (DacFx) APIs as actions. One of these actions is called Publish, which can incrementally update a database schema to match a source .dacpac file. Add the SqlPackage command-line utility to your development environment by following the steps below.

  1. Open the RequiredModules.psd1 file and add the following:
@{
    'SQLPackage_Download' = @{
        Name           = 'SqlPackage.zip'
        DependencyType = 'FileDownload'
        Source         = 'https://aka.ms/sqlpackage-windows'
        Target         = "C:\Temp\SqlPackage.zip" # We have to hardcode an absolute path for the WebClient object
    }
    'SQLPackage_Install'  = @{
        DependencyType = 'Command'
        Source         = '$SqlPackagePath = "C:\Temp\"',
        'if (Test-Path $SqlPackagePath\SqlPackage.zip) {
            $ExtractPath = Join-Path -Path output\RequiredModules -ChildPath SqlPackage
            New-Item -Path $ExtractPath -ItemType Directory
            Expand-Archive -Path "$SqlPackagePath\SqlPackage.zip" -Destination $ExtractPath -Force
        }'
        DependsOn      = 'SQLPackage_Download'
    }
}

Your RequiredModules.psd1 should now define three dependencies, which can be retrieved and installed by the Resolve-Dependency.ps1 script.

Figure 10: Bootstrapping requirements

Run the script once again to see if SqlPackage is fetched.

2. Run .\Resolve-Dependency.ps1 in your Windows PowerShell terminal

The end result creates an output folder with the required tools.

Figure 11: Bootstrap requirements resolved

You are now ready to publish your precious .dacpac file. But wait! What do you want to deploy it against? You don't have anything set up for that yet. There are plenty of tools out there that can easily spin up a SQL Server instance. You might create an Azure SQL database, if you're developing for Azure SQL. You might install Docker and pull an image that includes a full-fledged SQL Server Developer edition. But in this case, enter the world of LocalDB.

LocalDB is a feature of the SQL Server Express family that primarily targets developers and is available as a standalone installer. A LocalDB installation contains the minimal set of files necessary to start the SQL Server Database Engine. Once LocalDB is installed, you can initiate a connection using a special connection string. When you connect, the LocalDB infrastructure is automatically created and started, enabling you to use the database instantly without complex configuration.

Therefore, let's revisit the RequiredModules.psd1 file once more and add the installation of LocalDB. Keep in mind that you have to run the script as administrator this time, since it installs an MSI.

  1. Open the RequiredModules.psd1 file
  2. Add the following lines just underneath the SQLPackage_Install dependency
'SQLExpress_Download' = @{ 
    Name           = 'SQL2019-SSEI-Expr.exe' 
    DependencyType = 'FileDownload' 
    Source         = 'https://go.microsoft.com/fwlink/?LinkID=866658' 
    Target         = 'C:\Temp\SQL2019-SSEI-Expr.exe' 
} 
'SQLExpress_Install'  = @{ 
    DependencyType = 'Command' 
    Source         = '$SqlExpress = "C:\Temp\"', 
    'if(Test-Path $SqlExpress\SQL2019-SSEI-Expr.exe){ 
        $ExtractPath = Join-Path -Path output\RequiredModules -ChildPath SQL2019-SSEI-Expr 
        New-Item -Path $ExtractPath -ItemType Directory 
        $Path = (Resolve-Path $ExtractPath).Path 
        $Arguments = "/Action=Download", "MEDIAPATH=$Path", "MEDIATYPE=LocalDb", "/QUIET" 
        Start-Process -FilePath "$SqlExpress\SQL2019-SSEI-Expr.exe" -ArgumentList $Arguments -NoNewWindow -Wait 
        $LocalDb = (Get-ChildItem $Path -Filter "sqlLocalDb.msi" -Recurse).FullName 
        Start-Process -FilePath msiexec -ArgumentList "/i $LocalDb", "IACCEPTSQLLOCALDBLICENSETERMS=YES", "/qn" -Wait 
    }' 
    DependsOn      = 'SQLExpress_Download' 
} 

3. Run .\Resolve-Dependency.ps1 once more, but this time from an elevated prompt

When you check Programs and Features in the Control Panel, you'll notice that LocalDB has been installed successfully.

Figure 12: LocalDB installed on your local environment
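Before you continue, you can sanity-check the installation with LocalDB's own command-line utility (MSSQLLocalDB is the default instance name that ships with LocalDB):

SqlLocalDB.exe info                  # list the LocalDB instances on this machine
SqlLocalDB.exe start MSSQLLocalDB    # start the default instance
SqlLocalDB.exe info MSSQLLocalDB     # show state, version and named pipe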

Alright, you can now give SqlPackage.exe a go.

  1. Make sure your terminal's location points to the SqlPackage folder
  2. Execute .\sqlpackage.exe /Action:Publish /SourceFile:"C:\Repos\PoweringUpDatabase\src\PoweringUpDatabase\bin\Debug\PoweringUpDatabase.dacpac" /TargetConnectionString:"Server=(localdb)\MSSQLLocalDB;Integrated Security=true;Initial Catalog=PoweringUpDatabase;" in the terminal
Figure 13: SqlPackage.exe publishing action

3. To verify the results, open the SQL Server extension in Visual Studio Code, select Add Connection -> (localdb)\MSSQLLocalDB, and connect with Integrated authentication.

Now you’ll see that the PoweringUpDatabase has been deployed locally, and the objects are created. Sweet, you are getting the hang of it.

Figure 14: Verify the database objects

Have the option to test locally

Databases play a critical part in the overall application. In software systems, data travels back and forth through a UI or an API that exposes CRUD (Create, Read, Update and Delete) operations. It becomes crucial to validate the business rules you create in your database, whether in a function, a stored procedure, a trigger, etc.

There are various ways you can test databases, but the most basic one is unit testing. The primary goal of unit testing is to take the smallest piece of testable code, isolate it from the remainder of the code, and determine whether it behaves exactly as expected. Some properties of unit tests:

  • Unit tests are automated and executed by a unit test framework
  • Unit tests run relatively fast as they are isolated and don’t require a bunch of application settings or variables to be setup
  • Unit tests are deterministic, meaning that a stored procedure that multiplies two values should always produce the same result for the same inputs

Compared to other programming languages, there aren't many testing frameworks for SQL. One of the most widely used frameworks for SQL unit testing is tSQLt. tSQLt lets you implement unit tests in the same language (T-SQL) you write your database in, so you don't have to switch between tools to create the code. Straight from the tSQLt home page, the framework provides the following features for creating and running unit tests:

  • Tests are automatically run within transactions
  • Tests can be grouped together within a schema
  • Tests can be generated in plain text or XML
  • The framework gives you the ability to create fake tables and views

In the previous steps, you already created the solution file. You can now add the test database project side-by-side and include it in your solution file by following these steps.

  1. Select Database Project extension in Visual Studio Code and click the + button
  2. Database Project Type will be the same as the main project
  3. Provide the name, in this blog PoweringUpDatabase.Tests is used
  4. Select the src folder
  5. Select SQL Server 2019 version
  6. Turn on the SDK-style preview

When you have followed the steps above, you should have the results shown in the image below.

Figure 15: Test database project added in the src folder

Don’t forget to add the test database project to the solution file by running dotnet sln add PoweringUpDatabase.Tests.

With this setup, you're isolating your test database project from the main one. Before you can start using tSQLt, some references need to be set up, since you want a reference to your main database project, where all your objects are created, as well as to the testing objects provided by the tSQLt framework.

  1. Download the tSQLt framework from the tSQLt page
  2. Extract the content with your favorite archiving program
  3. Grab the tSQLt.2019.dacpac file and copy it into a tSQLt folder in your test database project

    Figure 16: tSQLt framework Data-Tier Application
  4. Open the Database Project extension -> Right click Database References on the test database project and click Add Database Reference
  5. Select Data-Tier application (.dacpac)
  6. Select Same database
  7. Browse the folder and select the tSQLt.2019.dacpac file
  8. Make sure you suppress errors caused by unresolved references

This makes a direct reference to the tSQLt .dacpac file. If you open your .sqlproj file, you’ll see that the reference is added.

 

<ItemGroup> 
    <ArtifactReference Include="tSQLt\tSQLt.2019.dacpac"> 
        <SuppressMissingDependenciesErrors>True</SuppressMissingDependenciesErrors> 
    </ArtifactReference> 
</ItemGroup> 

Make sure that you do the same for the main database project, which leads to the following reference.

<ItemGroup> 
    <ProjectReference Include="..\PoweringUpDatabase\PoweringUpDatabase.sqlproj"> 
        <Name>PoweringUpDatabase</Name> 
        <Project>{GUID}</Project> 
        <Private>True</Private> 
        <SuppressMissingDependenciesErrors>True</SuppressMissingDependenciesErrors> 
    </ProjectReference> 
</ItemGroup> 

Alright, it’s time to create some unit tests in your test database project.

  1. Create a new folder called Tests
  2. Create Add_SuperHero.sql file in the Tests folder with the following content to populate the schema
CREATE SCHEMA [Add_SuperHero]  
    AUTHORIZATION dbo 
GO 
EXECUTE sp_addextendedproperty @name = 'tSQLt.TestClass', @value = 1, @level0type = 'SCHEMA', @level0name = 'Add_SuperHero' 

3. Create a file named test Add_SuperHero returns the inputs.sql in the Tests folder to test the stored procedure, and add the following T-SQL snippet

CREATE PROCEDURE [Add_SuperHero].[test Add_SuperHero returns the inputs] 
AS 
DECLARE @SuperHero AS NVARCHAR(50) = 'Superman'; 
DECLARE @CoolnessRating AS SMALLINT = 6; 
 
IF OBJECT_ID('actual') IS NOT NULL DROP TABLE actual; 
IF OBJECT_ID('expected') IS NOT NULL DROP TABLE expected; 
 
CREATE TABLE expected 
( 
    SuperHero varchar(50), 
    CoolnessRating int 
); 
 
CREATE TABLE actual 
( 
    SuperHero varchar(50), 
    CoolnessRating int 
); 
 
INSERT INTO expected 
    (SuperHero, CoolnessRating) 
VALUES 
    ('Superman', 6); 
 
INSERT INTO actual 
    (SuperHero, CoolnessRating) 
EXECUTE [dbo].[Add_SuperHero] @SuperHero, @CoolnessRating; 
 
EXECUTE tSQLt.AssertEqualsTable 'expected', 'actual'; 

If everything is setup appropriately, the end result in the Database Project extension looks like the image below.

Figure 17: Database references including unit tests

The last thing required before deploying the test database project is to enable the CLR, since tSQLt depends on it. Therefore, you need to add a so-called pre-deployment script that runs before the deployment happens. To do so, follow the steps below.

4. In the Database Project Extension, right-click the test database project -> Add Folder called PreDeployment
5. On the PreDeployment folder, right-click -> Add Pre-Deployment script called Script.EnableCLR.sql
6. Add the following content to the script

EXEC sp_configure 'show advanced options', 1; 
RECONFIGURE 
 
EXEC sp_configure 'clr strict security', 0; 
EXEC sp_configure 'clr enabled', 1; 
RECONFIGURE 
 
EXEC sp_configure 'show advanced options', 0; 
RECONFIGURE 

When you open the project file, you will notice that an ItemGroup is added.
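The added element looks roughly like this (a sketch; the exact entries the extension writes may differ slightly):

<ItemGroup>
    <PreDeploy Include="PreDeployment\Script.EnableCLR.sql" />
</ItemGroup>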

Figure 18: Pre deployment element in project file

You can now build the test database project and deploy it with SqlPackage.exe. There's a small caveat when you call the command-line utility: since SqlPackage does not automatically resolve composite elements, you have to tell it to include them. You can do this by adding the /p:IncludeCompositeObjects=True property. See below for an example.
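As a sketch (the target database name in the connection string is an assumption; adjust the paths to your own layout):

.\sqlpackage.exe /Action:Publish `
    /SourceFile:"C:\Repos\PoweringUpDatabase\src\PoweringUpDatabase.Tests\bin\Debug\PoweringUpDatabase.Tests.dacpac" `
    /TargetConnectionString:"Server=(localdb)\MSSQLLocalDB;Integrated Security=true;Initial Catalog=PoweringUpDatabaseTests;" `
    /p:IncludeCompositeObjects=True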

Figure 19: Include composite elements when publishing

It's time to see the end result of all this hard work. If you open the SQL Server extension and connect to LocalDB, you should notice that two databases are deployed. You can now run EXEC tSQLt.RunAll to execute the unit tests.

Figure 20: Running all tSQLt unit tests against the test database

Conclusion

Microsoft.Build.Sql introduces a modern new way of developing SQL database projects. Even though the extension is still in preview, you can already see major differences compared to the standard SSDT database project. You've covered quite some ground and learned about some great principles and techniques to set up your project from scratch and prepare it for future changes. These principles and techniques are not limited to SQL database development; you can use them in any software project, whether it's a greenfield project or an existing one.

tSQLt is a great framework for testing databases. In this blog, you've primarily focussed on adding the framework to your solution. While you haven't covered how to write proper unit tests, the tSQLt open-source community provides best practices on how to do so. In the resources section, you'll find interesting links to dig deeper into this topic.

See you in the next part in this series. No spoilers ahead!

References

 


About the author

Gijs Reijn
Cloud Engineer

Gijs Reijn is a DevOps Engineer at Rabobank's ALM IT department. He primarily focusses on Azure DevOps and Azure, and loves to automate processes, including the standardization around them. Outside working hours, he can be found working out in the gym nearly every morning, writing his own blog to share knowledge with the community, and reading up on new ideas.
