Stitching it together: pipeline best practices for Azure Bicep
Welcome to the third and final part of this best practices series on Azure Bicep. If you've been following along, you've already gone through a lot of content.
In the first part of the series, you’ve learned 5 best practices for using Azure Bicep in general. In the second part of the series, you’ve gone through testing best practices for Azure Bicep, including:
- Learning about the testing pyramid
- Applying compliance rules before deploying Azure resources
- Exploring Pester and applying integration testing
In this last part of the series, you'll learn how to integrate everything into Azure Pipelines, including creating YAML templates and consuming the reporting results you've already produced locally. Be prepared one last time: grab your seatbelt and tighten it. This is not for beginners, as you're going to cover a lot of advanced YAML in this final part. Let's get started with a quick recap first.
A quick recap
Your repository should contain a new storage account resource, including its tests, as shown in figure 1.

Because you have already prepared a lot in the build automation script, you've reached the stage where you can integrate with Continuous Integration/Continuous Delivery (CI/CD) systems. In this final part, as already mentioned, you're going to use Azure DevOps Services, so if you want to follow along, you'll need an Azure DevOps organization. If you already have one available, awesome, you can proceed!
Templates, templates and did I already say templates?
Azure DevOps Services lets you define templates in YAML. Templates have these benefits:
- They improve the consistency of deployment descriptions
- They help express complexity
- They reduce manual and error-prone tasks
- They are code, so they can be stored in version control
- They promote reusability
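As a minimal illustration of that reusability (the file name say-hello-task.yml and its contents are made up for this sketch, not part of the repository you're building), a step template declares typed parameters once, and every pipeline that references it inherits the same tested steps:

```yaml
# say-hello-task.yml (hypothetical illustration)
parameters:
- name: name
  type: string
  default: 'world'

steps:
- script: echo "Hello, ${{ parameters.name }}!"
  displayName: 'Greet ${{ parameters.name }}'
```

A consuming pipeline would then reference it with `- template: say-hello-task.yml` and optionally override `name` under `parameters:`.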
That already answers most of the puzzle of why you should use templates, but why use them in combination with Azure Bicep? You've already added two resources to your repository, and the repository is going to expand in the future. You want to make it as easy as possible for newcomers to follow your practices for building Azure Bicep and eventually publish to your teams or organization. If the complexity is already described in templates, you can simply reuse those templates whenever a new resource is added to the repository. Let's see what that looks like in code.
Creating the build template stage
In the build automation script, you should have your BuildBicep task that is responsible for building your Azure Bicep files. Azure DevOps Services allows you to run that piece of PowerShell on so-called Azure Pipelines agents. In this example, Microsoft-hosted agents are used, but if you have your own self-hosted agents, you can target those as well. Agents offer several virtual machine images, each including a broad range of software.
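As a sketch, the difference between targeting a Microsoft-hosted image and a self-hosted pool is only the pool definition (the pool name MyAgentPool is a made-up example):

```yaml
# Microsoft-hosted agent: pick one of the provided virtual machine images
pool:
  vmImage: windows-latest

# Self-hosted agents: reference your own agent pool by name instead
# (hypothetical pool name)
# pool:
#   name: MyAgentPool
```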
Having said that, let’s add a new folder in the root of the repository called pipelines. You can create the file and folder structure as shown in figure 2.

In the build-bicep-task.yml, the following content is added to build a specific Azure Bicep resource:
parameters:
- name: templatePath
  type: string
  default: '$(System.DefaultWorkingDirectory)\src'
- name: bicepBuild
  type: string

steps:
- task: PowerShell@2
  displayName: 'Build ${{ parameters.bicepBuild }} Bicep file'
  inputs:
    targetType: 'inline'
    script: |
      $Params = @{
          Task         = 'BuildBicep'
          TemplatePath = "${{ parameters.templatePath }}\${{ parameters.bicepBuild }}"
      }
      Invoke-Build @Params
Inside the install-single-module-task.yml, the InvokeBuild module is installed. You might ask a clever question: "why not use the bootstrap script for this?" The bootstrap script installs more modules than required here, increasing the time it takes to run the stage. Don't worry, you're going to use it later.
parameters:
- name: moduleName
  type: string
  default: 'InvokeBuild'

steps:
- task: PowerShell@2
  displayName: 'Install single module'
  inputs:
    targetType: 'inline'
    script: Install-Module -Name "${{ parameters.moduleName }}" -Force -Scope CurrentUser
Lastly, for the task section, you need to publish the produced JSON file back as an artifact in the publish-pipeline-artifact-task.yml.
parameters:
- name: artifactName
  type: string
  default: json
- name: targetPath
  type: string
  default: $(System.DefaultWorkingDirectory)\build

steps:
- task: PublishPipelineArtifact@1
  displayName: 'Publish ${{ parameters.artifactName }} artifact'
  inputs:
    artifactName: ${{ parameters.artifactName }}
    targetPath: ${{ parameters.targetPath }}
Let’s bring it all together in one job inside the build-bicep-job.yml.
parameters:
- name: agent
  type: string
  default: windows-latest
- name: templatePath
  type: string
  default: '$(System.DefaultWorkingDirectory)\src'
- name: bicepBuild
  type: string
- name: preBuildSteps
  type: stepList
  default:
  - template: ../task/install-single-module-task.yml
- name: afterBuildSteps
  type: stepList
  default:
  - template: ../task/publish-pipeline-artifact-task.yml

jobs:
- job: Build
  pool:
    vmImage: ${{ parameters.agent }}
  steps:
  - ${{ parameters.preBuildSteps }}
  - template: ../task/build-bicep-task.yml
    parameters:
      templatePath: ${{ parameters.templatePath }}
      bicepBuild: ${{ parameters.bicepBuild }}
  - ${{ parameters.afterBuildSteps }}
You have now set up your template, which can be used in Azure Pipelines as a template reference! In the azure-pipelines.yml file, you can specify the resource you want to target, in this case the storage account.
variables:
  bicepBuild: storageaccount

stages:
- stage: Build
  displayName: "Build"
  jobs:
  - template: ..\..\templates\job\build-bicep-job.yml
    parameters:
      bicepBuild: $(bicepBuild)
The result can be seen in figure 3 after you’ve added the YAML file in Azure Pipelines.

Capturing test results in Azure Pipelines with NUnit reports
You want to know as soon as possible when something is wrong with your code, just like you've done locally. You also want to make sure that for each change you make to the code, unit testing runs after the build has completed. Developers sometimes forget to run the tests locally, but with Azure DevOps Services you can enforce this through repository or build policies.
In the second part of the series, the ValidateBicep task already produced these so-called NUnit reports. NUnit reports can be consumed by Azure Pipelines through the Publish Test Results task, which gives insight into how many tests have passed or failed. Unfortunately, the TestBicep task does not produce these reports yet. Let's see if you can patch it up.
Generating reporting results for ARM-TTK
There is an extension available to run ARM-TTK tests, but you've already got your own task in the build automation script. Sam Cogan (Cloud Architect and Microsoft Azure MVP) made the project open source under the MIT license, so a huge thanks to him. Grab this code and add the folder structure shown in figure 4.

Modify the following lines:
$hash = Get-FileHash -Path $testFile -Algorithm MD5
$NunitXml | Out-File -FilePath "$Path\$($(get-item $testFile).basename)-$($hash.Hash)-armttk.xml" -Encoding utf8 -Force
With the following code to return the file path:
$hash = Get-FileHash -Path $testFile -Algorithm SHA256
$FilePath = "$Path\$($(get-item $testFile).basename)-$($hash.Hash)-armttk.xml"
$NunitXml | Out-File -FilePath $FilePath -Encoding utf8 -Force
return $FilePath
In the TestBicep task, add the following lines at the beginning to import that juicy module:
if (-not (Get-Command buildHelpers -ErrorAction SilentlyContinue))
{
    Write-Build Yellow "Importing build helper module"
    Import-Module "$BuildRoot\buildHelpers\buildHelpers.psm1"
}
Alter the foreach loop to set a variable when the task runs from Azure Pipelines:
foreach ($Package in $Packages)
{
    Write-Build Yellow "Testing against: $Package"
    # Both skipped variables are used, and blanks cannot be handled with nested templates
    $Result = Test-AzTemplate -TemplatePath $Package -Skip "Variables-Must-Be-Referenced", "Template-Should-Not-Contain-Blanks"
    $FileOutput = Export-NUnitXml -TestResults $Result -Path $TestDirectory
    Write-Build Yellow "NUnit report generated: $FileOutput"

    if ($env:TF_Build)
    {
        Write-Host "##vso[task.setvariable variable=TestResultFile]$FileOutput"
    }
}
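The `##vso[task.setvariable]` line works because Azure Pipelines scans every line a task writes to standard output for logging commands. A sketch of the round trip, with an illustrative variable name and path:

```yaml
steps:
- powershell: |
    # Writing a logging command to stdout registers a pipeline variable
    Write-Host "##vso[task.setvariable variable=TestResultFile]build\results.xml"
  displayName: 'Set variable'
- powershell: |
    # Later tasks in the same job can read it through macro syntax
    Write-Host "Result file: $(TestResultFile)"
  displayName: 'Read variable'
```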
To test it, you can run Invoke-Build -TemplatePath .\src\storageaccount\ -Task BuildBicep, TestBicep in your terminal. In figure 5, you can see that the NUnit report has been generated for the ARM-TTK.

Pew pew! Pure sweetness: you now have NUnit reports from both the ARM-TTK and PSRule for Azure, since you already prepared the PSRule for Azure module earlier. Let's add the templates for the unit test stage to combine both results.
Adding ARM-TTK YAML job template
Let’s first start by creating the ARM-TTK checker task, including publishing the results. In figure 6, you can see the files created.

In the download-pipeline-artifact-task.yml, you download the JSON file to test against.
parameters:
- name: targetPath
  type: string
  default: '$(System.ArtifactsDirectory)'
- name: artifactName
  type: string
  default: json

steps:
- task: DownloadPipelineArtifact@2
  displayName: 'Download ${{ parameters.artifactName }} artifact'
  inputs:
    artifactName: ${{ parameters.artifactName }}
    targetPath: ${{ parameters.targetPath }}
The arm-ttk-checker-task.yml is responsible for executing the TestBicep task in the build automation script.
steps:
- task: PowerShell@2
  displayName: 'Run ARM Test Toolkit'
  inputs:
    targetType: 'inline'
    script: |
      $Params = @{
          Task           = 'TestBicep'
          BuildDirectory = '$(System.ArtifactsDirectory)'
      }
      Invoke-Build @Params
Finally, the publish-test-results-task.yml is responsible for publishing back the results.
parameters:
- name: testResultsFormat
  type: string
  default: 'NUnit'
- name: testResultsFiles
  type: string
  default: $(TestResultFile)
- name: searchFolder
  type: string
  default: $(System.DefaultWorkingDirectory)
- name: mergeTestResults
  type: boolean
  default: true
- name: failTaskOnFailedTests
  type: boolean
  default: true
- name: testRunTitle
  type: string
  default: 'Unit test results'

steps:
- task: PublishTestResults@2
  displayName: 'Publishing ${{ parameters.testResultsFormat }} results'
  inputs:
    testResultsFormat: ${{ parameters.testResultsFormat }}
    testResultsFiles: ${{ parameters.testResultsFiles }}
    searchFolder: ${{ parameters.searchFolder }}
    mergeTestResults: ${{ parameters.mergeTestResults }}
    failTaskOnFailedTests: ${{ parameters.failTaskOnFailedTests }}
    testRunTitle: ${{ parameters.testRunTitle }}
Notice that the testResultsFiles parameter defaults to the variable set by the script, so you don't have to worry about it. Let's stitch that together in the arm-ttk-checker-job.yml.
parameters:
- name: agent
  type: string
  default: windows-latest
- name: preTestSteps
  type: stepList
  default:
  - template: ../task/install-single-module-task.yml
  - template: ../task/download-pipeline-artifact-task.yml
- name: afterTestSteps
  type: stepList
  default:
  - template: ../task/publish-test-results-task.yml

jobs:
- job: ARMTTK
  displayName: 'Check ARM template with TTK'
  pool:
    vmImage: ${{ parameters.agent }}
  steps:
  - ${{ parameters.preTestSteps }}
  - template: ../task/arm-ttk-checker-task.yml
  - ${{ parameters.afterTestSteps }}
Let's modify your current pipeline to include a new unit test stage to see the results.
variables:
  bicepBuild: storageaccount

stages:
- stage: Build
  displayName: "Build"
  jobs:
  - template: ..\..\templates\job\build-bicep-job.yml
    parameters:
      bicepBuild: $(bicepBuild)
- stage: UnitTest
  displayName: "Unit test"
  jobs:
  - template: ..\..\templates\job\arm-ttk-checker-job.yml
The pipeline should have triggered and generated the report, as you can see in figure 7.

If you drill deeper, you can see which tests have passed in the report. Awesome! Now for the PSRule for Azure module results.
PSRule for Azure YAML job template
As already mentioned, the ValidateBicep task already produces NUnit results for the PSRule for Azure module, which makes it much easier to publish them in Azure Pipelines. Now you might think: "should another stage be created?" That's a fair question, but these results also fall under the unit testing stage. So how can you make both available in Azure Pipelines?
That's where parallel jobs come into play. A pipeline is defined as a collection of jobs, and jobs that don't depend on each other can run at the same time, provided you have multiple parallel jobs at your disposal.
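A minimal sketch of that behavior (job names and steps are illustrative, not the repository's real jobs): because neither job declares a dependsOn, Azure Pipelines schedules both at the same time when enough parallel jobs and agents are available.

```yaml
jobs:
- job: CheckA
  steps:
  - script: echo "Running check A"
- job: CheckB  # no dependsOn, so CheckA and CheckB run in parallel
  steps:
  - script: echo "Running check B"
```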
With this knowledge at hand, let's introduce two more tasks and combine both jobs in a single template to describe the complexity. In figure 8, you can see the files added to the repository.

Inside the bootstrap-task.yml, you can now bootstrap the modules that need to be installed; in this case, they are required for PSRule for Azure.
steps:
- task: PowerShell@2
  displayName: 'Install PowerShell modules'
  inputs:
    targetType: 'inline'
    script: |
      .\bootstrap.ps1 -Bootstrap
      # Install-Module -Name InvokeBuild -Force
The psrule-checker-task.yml executes the ValidateBicep task.
parameters:
- name: templatePath
  type: string
  default: '$(System.DefaultWorkingDirectory)\src'
- name: bicepBuild
  type: string
  default: $(bicepBuild)

steps:
- task: PowerShell@2
  displayName: 'Validate ${{ parameters.bicepBuild }} Bicep file'
  inputs:
    targetType: 'inline'
    script: |
      $Params = @{
          Task         = 'ValidateBicep'
          TemplatePath = "${{ parameters.templatePath }}\${{ parameters.bicepBuild }}"
      }
      Invoke-Build @Params
You can do nearly the same for the psrule-checker-job.yml as you did with the arm-ttk-checker-job.yml, only targeting different tasks and grabbing the output as a variable.
parameters:
- name: agent
  type: string
  default: windows-latest
- name: templatePath
  type: string
  default: '$(System.DefaultWorkingDirectory)\src'
- name: bicepBuild
  type: string
  default: $(bicepBuild)
- name: preTestSteps
  type: stepList
  default:
  - template: ../task/bootstrap-task.yml
- name: afterTestSteps
  type: stepList
  default:
  - template: ../task/publish-test-results-task.yml
    parameters:
      testResultsFiles: $(PSRuleResultFile)

jobs:
- job: PSRule
  displayName: 'Check PSRule for Azure'
  pool:
    vmImage: ${{ parameters.agent }}
  steps:
  - ${{ parameters.preTestSteps }}
  - template: ../task/psrule-checker-task.yml
    parameters:
      templatePath: ${{ parameters.templatePath }}
      bicepBuild: ${{ parameters.bicepBuild }}
  - ${{ parameters.afterTestSteps }}
You can see that the testResultsFiles parameter is now defined with a different variable, so let's set that variable in the ValidateBicep task just after the results have been written.
if ($env:TF_Build)
{
    Write-Host "##vso[task.setvariable variable=PSRuleResultFile]$OutputPath"
}
You should already have the unit-test-job.yml available but kept empty. You can now fill it in and specify both jobs.
parameters:
- name: agent
  type: string
  default: windows-latest
- name: templatePath
  type: string
  default: '$(System.DefaultWorkingDirectory)\src'
- name: bicepBuild
  type: string
  default: $(bicepBuild)
- name: beforeTtkScan
  type: stepList
  default:
  - template: ../task/install-single-module-task.yml
  - template: ../task/download-pipeline-artifact-task.yml
- name: afterTtkScan
  type: stepList
  default:
  - template: ../task/publish-test-results-task.yml

jobs:
- template: arm-ttk-checker-job.yml
  parameters:
    agent: ${{ parameters.agent }}
    preTestSteps: ${{ parameters.beforeTtkScan }}
    afterTestSteps: ${{ parameters.afterTtkScan }}
- template: psrule-checker-job.yml
  parameters:
    agent: ${{ parameters.agent }}
    templatePath: ${{ parameters.templatePath }}
    bicepBuild: ${{ parameters.bicepBuild }}
    afterTestSteps:
    - template: ../task/publish-test-results-task.yml
      parameters:
        testResultsFiles: $(PSRuleResultFile)
That’s pretty neat, right?! Finally, you can update the azure-pipelines.yml file to:
variables:
  bicepBuild: storageaccount

stages:
- stage: Build
  displayName: "Build"
  jobs:
  - template: ..\..\templates\job\build-bicep-job.yml
    parameters:
      bicepBuild: $(bicepBuild)
- stage: UnitTest
  displayName: "Unit test"
  jobs:
  - template: ..\..\templates\job\unit-test-job.yml
Creating the integration stage template
Learning from the testing pyramid in the second part of this series, the integration tests run after the unit test stage. Let’s add two more templates as seen in figure 9 to the repository.

You are already getting the hang of it, so let’s fill them in. In the integration-test-task.yml, you should target the IntegrationTest task.
parameters:
- name: azureSubscription
  type: string
- name: templatePath
  type: string
  default: '$(System.DefaultWorkingDirectory)\src'
- name: bicepBuild
  type: string
  default: $(bicepBuild)

steps:
- task: AzurePowerShell@5
  displayName: 'Run integration tests'
  inputs:
    azureSubscription: "${{ parameters.azureSubscription }}"
    ScriptType: 'InlineScript'
    Inline: |
      $Params = @{
          Task         = 'IntegrationTest'
          TemplatePath = "${{ parameters.templatePath }}\${{ parameters.bicepBuild }}"
      }
      Invoke-Build @Params
    azurePowerShellVersion: 'LatestVersion'
From the integration-test-job.yml, you need Pester to be installed. Let's use the bootstrap script for that and change the test run title for integration tests.
parameters:
- name: agent
  type: string
  default: windows-latest
- name: azureSubscription
  type: string
- name: beforeIntegrationTests
  type: stepList
  default:
  - template: ../task/bootstrap-task.yml
- name: afterIntegrationTests
  type: stepList
  default:
  - template: ../task/publish-test-results-task.yml
    parameters:
      testResultsFiles: $(IntegrationResultFile)
      testRunTitle: 'Integration test results'

jobs:
- job: IntegrationTest
  displayName: 'Integration test'
  pool:
    vmImage: ${{ parameters.agent }}
  steps:
  - ${{ parameters.beforeIntegrationTests }}
  - template: ../task/integration-test-task.yml
    parameters:
      azureSubscription: ${{ parameters.azureSubscription }}
  - ${{ parameters.afterIntegrationTests }}
Again, make sure you modify the build automation script to set that variable after the tests have run.
if ($env:TF_Build)
{
    $OutputFile = Join-Path -Path $TestDirectory -ChildPath 'IntegrationResults.xml'
    Write-Host "##vso[task.setvariable variable=IntegrationResultFile]$OutputFile"
}
You can now add the integration stage to your pipeline. Make sure that you have filled in your own service connection details.
variables:
  bicepBuild: storageaccount
  azureSubscription: "<serviceconnection>"

stages:
- stage: Build
  displayName: "Build"
  jobs:
  - template: ..\..\templates\job\build-bicep-job.yml
    parameters:
      bicepBuild: $(bicepBuild)
- stage: UnitTest
  displayName: "Unit test"
  jobs:
  - template: ..\..\templates\job\unit-test-job.yml
- stage: IntegrationTest
  displayName: "Integration test"
  jobs:
  - template: ..\..\templates\job\integration-test-job.yml
    parameters:
      azureSubscription: $(azureSubscription)
Now, give that baby a kick (not literally of course) to see the results in figure 10.

Publishing modules through Azure Pipelines
If the build and all tests have run successfully, you should be ready to publish your hard work. To wrap up the pipeline, you should know by now that the PublishBicep task is responsible for publishing the template specifications to Azure.
Therefore, you can add two more files, one called publish-bicep-task.yml with the following content:
parameters:
- name: resourceGroupName
  type: string
- name: location
  type: string
  default: westeurope
- name: azureSubscription
  type: string

steps:
- task: AzurePowerShell@5
  displayName: 'Publish Bicep file'
  inputs:
    azureSubscription: ${{ parameters.azureSubscription }}
    ScriptType: 'InlineScript'
    azurePowerShellVersion: LatestVersion
    Inline: |
      $Params = @{
          Task              = 'PublishBicep'
          BuildDirectory    = '$(System.ArtifactsDirectory)'
          ResourceGroupName = '${{ parameters.resourceGroupName }}'
          Location          = '${{ parameters.location }}'
      }
      Invoke-Build @Params
In the job folder, create a file called publish-bicep-job.yml and add the following to it:
parameters:
- name: agent
  type: string
  default: windows-latest
- name: resourceGroupName
  type: string
- name: location
  type: string
  default: westeurope
- name: azureSubscription
  type: string
- name: prePublishSteps
  type: stepList
  default:
  - template: ../task/install-single-module-task.yml
  - template: ../task/download-pipeline-artifact-task.yml
- name: afterPublishSteps
  type: stepList
  default: []

jobs:
- job: Publish
  pool:
    vmImage: ${{ parameters.agent }}
  steps:
  - ${{ parameters.prePublishSteps }}
  - template: ../task/publish-bicep-task.yml
    parameters:
      resourceGroupName: ${{ parameters.resourceGroupName }}
      location: ${{ parameters.location }}
      azureSubscription: ${{ parameters.azureSubscription }}
  - ${{ parameters.afterPublishSteps }}
Before you add the stage to your pipeline, you can take the version number that is fetched inside the build automation script and set it as the build number during pipeline execution. Make sure you add the following lines to the PublishBicep task:
if ($env:TF_Build)
{
    Write-Host "##vso[build.updatebuildnumber]$Version"
}
This makes sure that the build number of every run is set by the build automation script. Now, for the full pipeline, the publish stage should be the last one to run.
variables:
  bicepBuild: storageaccount
  azureSubscription: "<serviceconnection>"
  resourceGroupName: "<resourceGroupName>"

stages:
- stage: Build
  displayName: "Build"
  jobs:
  - template: ..\..\templates\job\build-bicep-job.yml
    parameters:
      bicepBuild: $(bicepBuild)
- stage: UnitTest
  displayName: "Unit test"
  jobs:
  - template: ..\..\templates\job\unit-test-job.yml
- stage: IntegrationTest
  displayName: "Integration test"
  jobs:
  - template: ..\..\templates\job\integration-test-job.yml
    parameters:
      azureSubscription: $(azureSubscription)
- stage: Publish
  displayName: 'Publish template'
  jobs:
  - template: ..\..\templates\job\publish-bicep-job.yml
    parameters:
      azureSubscription: $(azureSubscription)
      resourceGroupName: '$(resourceGroupName)'
Testing out the pipeline, you should now see four stages, as seen in figure 11. The version number is also set from the templateSpecVersion. You have done a great job!

Conclusion
That really was some pretty advanced stuff right there. You can now safely remove your seatbelt. If you made it to the end, you have survived the last part of this series on best practices for Azure Bicep. By now, you should have figured out that these templates can easily be reused when new Bicep resources are added to the repository. The only thing you have to change when creating new pipelines for those resources is the variable that selects the specific resource to build. The folder structure speaks for itself.
With this new knowledge, can you help your team members by deploying these template specifications through Azure Pipelines? My bet is that you can! Good luck.