Publish automatically to Power BI environments with Azure DevOps pipelines
This is Part 3 of a series about creating a professional developer experience for working with Power BI. If you haven't already seen the first, you may prefer to start there.
In the first post in this series, I built an Azure DevOps pipeline to automate steps in a Power BI development workflow. The pipeline implemented a very basic workflow – as soon as a developer committed a new report version to Git, the pipeline deployed it immediately into a Power BI workspace.
In this post I'll be building a pipeline to support a more sophisticated workflow that enables peer review and stakeholder testing.
Environments
A key requirement for internal and user acceptance testing is the ability to share development work with developer colleagues and business stakeholders before it is deployed into production. This is typically achieved using different environments – implemented in Power BI as different workspaces – to contain reports at different stages of development.
In the first post of this series, I deployed a sample report to a workspace called AdventureWorks Reports. In this post I'll deploy the report to three workspaces at different points in the development workflow:
- First, to AdventureWorks Reports [Test], to share in-flight development work with report developer colleagues for internal review
- then to AdventureWorks Reports [UAT], to share completed developments with business stakeholders for user acceptance testing
- finally to AdventureWorks Reports, the production workspace where end-users access the report for routine use.
A set of three environments is what I will need to support my workflow, but this pattern is flexible enough to handle any number of environments 😊. I've provided a four-environment example later in this post.
Workflow
Let's imagine a report development team that – in coordination with the business it serves – works like this:
- The business has a variety of new and evolving reporting requirements.
- Requirements are defined as a prioritised backlog of work items (pieces of report development work).
- A package of agreed work items is delivered to the business at regular intervals, perhaps every week or two. This might be the duration of a single sprint for a scrum team, or simply a release cadence which suits the business.
A critical feature of this workflow is that a regular release cycle for new developments has been agreed with the business. A smoother workflow makes for a better developer experience, but turning that into greater customer satisfaction requires a collaborative approach with stakeholder partners.
Git branches
In this workflow, I'll be using Git branches to manage delivery of work items and preparation of releases:
- `main` defines the authoritative set of Power BI reports and datasets currently in production (i.e. in use by the business).
- Branches with names beginning `rc/` are release candidates. An `rc/` branch is created – copied from `main` – when a new release cycle (e.g. a sprint) begins.
- Individual work items are developed in separate feature branches, isolated from `main` and `rc/` branches. When a piece of development is ready for review by business stakeholders, its feature branch is merged into the release candidate branch.

The effect of managing branches like this is to accumulate development work in the current `rc/` branch until the release is ready to go. When it is time for updates to be released into production, the `rc/` branch is merged into `main`.
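To make the branch mechanics concrete, here's a minimal sketch of that lifecycle as Git commands. The `rc/2024-05` and `feature/sales-kpi` branch names (and the commit message) are purely illustrative:

```powershell
# start a new release cycle: create a release candidate branch from main
git checkout main
git pull
git checkout -b rc/2024-05
git push -u origin rc/2024-05

# develop a single work item in its own feature branch, based on the release candidate
git checkout -b feature/sales-kpi rc/2024-05
# ...edit the report in Power BI Desktop, then commit and push as usual...
git add powerbi-pro-devex-series/03-ReportEnvironments/ExecutiveSummary.pbix
git commit -m "Adjust sales KPI visual"
git push -u origin feature/sales-kpi

# the feature branch is merged into rc/2024-05 via a pull request;
# at release time, rc/2024-05 is merged into main
```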
This workflow is based on one described by Mathias Thierbach at SQLBits 2023 (also at PASS 2022). Microsoft's Enterprise content publishing usage scenario presents a different workflow using similar tools.
There is no single “correct” workflow – you need to find a process that delivers the developer experience, governance controls and business support that you need.
Developer workflow
Operationally, delivery of a single work item (e.g. a change to one report) follows the workflow illustrated below.

The developer selects a work item scheduled for inclusion in the next release and creates a new feature branch in which to make report changes.
As work on the feature continues, the dev regularly commits changes to the feature branch and pushes them to the central repo.
When the dev thinks the changes are ready, they open a pull request (PR) to merge the feature branch into a previously created release candidate (`rc/`) branch – this branch is used to accumulate changes ready for publication in the next release.

Opening a pull request into an `rc/` branch causes the deployment pipeline to publish the report to a testing workspace in the Power BI service.
Colleagues in the development team review the modified report – in the testing workspace – and may suggest improvements.
If required, the dev makes improvements in the same feature branch, committing and pushing them as normal – which also automatically updates the PR. Every update to the PR causes the deployment pipeline to re-publish the revised report into the testing workspace, ready for another review.
When no more revisions are needed, a developer colleague approves the PR. Despite the name, this isn't about being “given permission” – it's just good practice not to mark your own homework 😃. The PR can now be completed, merging the feature branch into the `rc/` branch.

Merging changes into an `rc/` branch causes the deployment pipeline to publish the report to a UAT workspace in the Power BI service.
Report owners and other business stakeholders review the proposed changes in the UAT workspace, and provide feedback.
If stakeholder feedback means that more revisions are necessary, a new feature branch is created from the `rc/` branch. The workflow continues as for any new feature.

When stakeholders approve the report version in the UAT workspace, no further modifications are required. The finished report development remains in the `rc/` branch, ready for its eventual release.
Automatic deployment to production
As time goes on, completed work items accumulate in the release candidate `rc/` branch. Finally, on the release schedule agreed with the business, the `rc/` branch is merged into the `main` branch.

Merging an updated report into `main` from the release candidate branch causes its deployment pipeline to publish the report automatically into its production workspace, ready for use by stakeholders.
Automated deployment
The features of this workflow don't require any technical capability we haven't already encountered – I'm deploying a Power BI report using an Azure DevOps pipeline, just as in the first post in this series.
However, deployment decisions need to be a bit more sophisticated:
- The pipeline has to run automatically at a number of different milestones in the development workflow.
- The Power BI workspace where the report is published varies, depending on the workflow milestone where the pipeline runs.
In this section I'll extend the pipeline to handle these changes, starting with PowerShell resources used by the pipeline.
The report I'll be deploying contains the same visuals as my original sample report, but it uses a separate, standalone dataset. For now, every published report – one in each environment – will use the same shared dataset. I'll look at managing multiple dataset environments later in the series.
PbiDeployment module
I introduced the `PbiDeployment` module briefly in the last post. Its purpose is to wrap up functionality I want to re-use, and to allow some of the detail to be factored out of the deployment script – making the script a bit easier to read. I'll be extending the module in this and future posts.
Here's the version of the module I'm using in this post:
```powershell
function Use-Pbi([string[]]$WithModules = @('Profile')) {
  # install PowerShell modules for Power BI
  foreach($module in $WithModules) {
    Install-Module -Name "MicrosoftPowerBIMgmt.$module" -AllowClobber -Force -Scope CurrentUser
  }

  # log into Power BI
  $secureClientSecret = ConvertTo-SecureString $Env:AZURE_CLIENT_SECRET -AsPlainText -Force
  $credentials = New-Object PSCredential($Env:AZURE_CLIENT_ID, $secureClientSecret)
  Connect-PowerBIServiceAccount `
    -Tenant $Env:AZURE_TENANT_ID `
    -ServicePrincipal `
    -Credential $credentials | Out-Null
  Write-Host "Connected to Power BI"
  $credentials
}

function Get-PbiWorkspaceId([string]$Name) {
  $workspaces = Get-PowerBIWorkspace -Name $Name

  if($workspaces.Count -eq 0) {
    $workspace = New-PbiWorkspace -Name $Name
  } elseif ($workspaces.Count -eq 1) {
    $workspace = $workspaces[0]
  } else {
    throw "Found $($workspaces.Count) workspaces named $Name!"
  }

  Write-Host "Workspace ID = $($workspace.Id)"
  $workspace.Id
}

function New-PbiWorkspace([string]$Name) {
  Write-Host "Creating workspace $Name"
  $workspace = New-PowerBIWorkspace -Name $Name
  Add-PowerBIWorkspaceUser `
    -Workspace $workspace `
    -AccessRight "Admin" `
    -Identifier "$Env:PBI_WORKSPACE_ADMINS" `
    -PrincipalType "Group"
  $workspace
}
```
The module now contains three functions:
- `Use-Pbi` (line 1), which we met in the last post, loads PowerShell modules for Power BI and authenticates against the Power BI service.
- `Get-PbiWorkspaceId` (line 18) takes the name of a workspace and returns its ID. If the workspace does not exist, it calls `New-PbiWorkspace` to create it.
- `New-PbiWorkspace` (line 33) creates a new workspace with a specified name. It also adds a specified AAD group to the workspace's admin role – this ensures that members of a nominated workspace administrators group gain access automatically to every new workspace.

The AAD group's ID is provided in an environment variable called `PBI_WORKSPACE_ADMINS` – the deployment pipeline will make this variable available via the PowerShell script task.
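If you want to try the module outside a pipeline, something like this works from a local PowerShell session in the folder containing the `PbiDeployment` module – the environment variable values are placeholders for your own service principal and AAD group details:

```powershell
# placeholder values – substitute your own service principal and AAD group IDs
$Env:AZURE_TENANT_ID      = '<tenant-guid>'
$Env:AZURE_CLIENT_ID      = '<service-principal-app-id>'
$Env:AZURE_CLIENT_SECRET  = '<service-principal-secret>'
$Env:PBI_WORKSPACE_ADMINS = '<workspace-admins-aad-group-guid>'

Import-Module ./PbiDeployment/PbiDeployment.psm1 -Force

# connect to Power BI, then resolve (or create) a workspace by name
Use-Pbi -WithModules @('Workspaces') | Out-Null
Get-PbiWorkspaceId -Name 'AdventureWorks Reports [Test]'
```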
PowerShell script
Here's the updated PowerShell script for report deployment, to be called by the report's deployment pipeline. It's similar to the one I used to deploy the report previously, but:

- it uses the `Use-Pbi` function (line 21) to gain access to Power BI
- it uses the new `Get-PbiWorkspaceId` function (line 29) instead of the native `Get-PowerBIWorkspace` cmdlet I used last time. This abstracts away the detail of checking for workspace existence – and creating a workspace if necessary – into the `PbiDeployment` module.
```powershell
param(
    [Parameter(Mandatory = $true)]
    [ValidateNotNullOrEmpty()]
    [string]
    $ReportName,

    [Parameter(Mandatory = $true)]
    [ValidateNotNullOrEmpty()]
    [string]
    $WorkspaceName,

    [Parameter(Mandatory = $true)]
    [ValidateNotNullOrEmpty()]
    [string]
    $PbixFilePath
)
$scriptFolder = $MyInvocation.MyCommand.Path | Split-Path
Import-Module $scriptFolder\PbiDeployment\PbiDeployment.psm1 -Force

# Connect to Power BI
Use-Pbi -WithModules @('Workspaces', 'Reports') | Out-Null

# publish the report
Write-Host "Deploying $ReportName to workspace $WorkspaceName"
Write-Host $PbixFilePath
New-PowerBIReport `
    -Path $PbixFilePath `
    -Name $ReportName `
    -WorkspaceId (Get-PbiWorkspaceId -Name $WorkspaceName) `
    -ConflictAction CreateOrOverwrite
```
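The script can also be run by hand for testing – a sketch, assuming the same environment variables are set as in the module example above and that you're working from the repository root:

```powershell
# invoke the deployment script directly (environment variables as in the module example)
./powerbi-pro-devex-series/03-ReportEnvironments/Deploy-PbiReport.ps1 `
    -ReportName "Executive Summary" `
    -WorkspaceName "AdventureWorks Reports [Test]" `
    -PbixFilePath "./powerbi-pro-devex-series/03-ReportEnvironments/ExecutiveSummary.pbix"
```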
Pipeline definition
Finally – the revised pipeline definition. The pipeline's steps
haven't changed much – it still contains one PowerShell task, calling the deployment script, but notice that it's now passing the PBI_WORKSPACE_ADMINS
environment variable required by the New-PbiWorkspace
function (line 46). Like the Azure credentials used to connect to Power BI, the value for the variable is being supplied by a secret pipeline variable, PowerBiWorkspaceAdmins
.
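If you'd rather script the creation of those secret variables than use the pipeline editor, the Azure DevOps CLI can do it. This is a sketch only – it assumes the `azure-devops` CLI extension is installed, and the organisation, project and pipeline names are illustrative:

```powershell
# create the secret pipeline variable read by the deployment task
az pipelines variable create `
    --organization https://dev.azure.com/my-org `
    --project "PowerBI" `
    --pipeline-name "ExecutiveSummary" `
    --name PowerBiWorkspaceAdmins `
    --secret true `
    --value "<workspace-admins-aad-group-guid>"
```

With the secret variables in place, here's the full pipeline definition: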
```yaml
trigger:
  branches:
    include:
      - rc/*
      - main
  paths:
    include:
      - powerbi-pro-devex-series/03-ReportEnvironments/ExecutiveSummary.pbix
pr:
  branches:
    include:
      - rc/*
  paths:
    include:
      - powerbi-pro-devex-series/03-ReportEnvironments/ExecutiveSummary.pbix

variables:
  - name: folderPath
    value: $(System.DefaultWorkingDirectory)/powerbi-pro-devex-series/03-ReportEnvironments
  - name: workspaceName
    ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
      value: 'AdventureWorks Reports'
    ${{ elseif eq(variables['Build.Reason'], 'PullRequest') }}:
      value: 'AdventureWorks Reports [Test]'
    ${{ elseif startsWith(variables['Build.SourceBranch'], 'refs/heads/rc/') }}:
      value: 'AdventureWorks Reports [UAT]'

pool:
  vmImage: ubuntu-latest

steps:
  - task: PowerShell@2
    displayName: Publish Power BI report
    inputs:
      targetType: filePath
      filePath: $(folderPath)/Deploy-PbiReport.ps1
      arguments: >
        -ReportName "Executive Summary"
        -WorkspaceName "$(workspaceName)"
        -PbixFilePath "$(folderPath)/ExecutiveSummary.pbix"
      failOnStderr: true
    env:
      AZURE_TENANT_ID: $(AzureTenantId)
      AZURE_CLIENT_ID: $(AzureClientId)
      AZURE_CLIENT_SECRET: $(AzureClientSecret)
      PBI_WORKSPACE_ADMINS: $(PowerBiWorkspaceAdmins)
```
The main difference between this pipeline definition and the previous version is in the `trigger`, `pr` and `variables` definitions – together these control when the pipeline runs, and the Power BI workspace where the report will be published:
The `pr` option (lines 9-15) specifies pipeline runs to take place when a developer opens a PR for internal review (deploy to test). The pipeline will run:

- when a pull request is opened
- when the target branch for that pull request is a branch with a name beginning `rc/`
- when the pull request includes a change to my `ExecutiveSummary.pbix` file
YAML pr triggers are available in GitHub but aren't supported in Azure Repos Git – instead, use a build validation branch policy to trigger a pipeline run when a PR is created. I'll come back to these in a later post.
The `trigger` option (lines 1-8) specifies pipeline runs required when branches are merged into an `rc/` branch (deploy to UAT) or the `main` branch (deploy to production). The pipeline will run:

- when a change is made to my `ExecutiveSummary.pbix` file
- in either the `main` branch, or in a branch with a name beginning `rc/`
The `trigger` and `pr` sections control when the pipeline will run, but don't change where the report will be published. To manage that aspect, the `variables` section defines a `workspaceName` variable, used to specify the target Power BI workspace. The value of the variable is set using conditional insertion expressions (lines 20-26):
- If the pipeline is run from the `main` branch – detected using a pre-defined pipeline variable – the workspace name is set to “AdventureWorks Reports”.
- Otherwise, if the pipeline is running in response to a pull request, it is set to “AdventureWorks Reports [Test]”.
- Otherwise, if the pipeline is run from a branch with a name beginning `rc/`, it is set to “AdventureWorks Reports [UAT]”.
This set of conditions, in this order, ensures that the right target workspace is selected for each deployment point in the workflow.
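The same decision logic can be written as ordinary PowerShell, which may make the precedence easier to see. This function is purely illustrative – the pipeline evaluates the YAML conditional insertion expressions itself, and nothing like this runs at deployment time:

```powershell
# illustrative only: the workspace-selection rules, in the same order as the YAML expressions
function Get-TargetWorkspaceName([string]$SourceBranch, [string]$BuildReason) {
    if ($SourceBranch -eq 'refs/heads/main') {
        'AdventureWorks Reports'            # production release
    } elseif ($BuildReason -eq 'PullRequest') {
        'AdventureWorks Reports [Test]'     # internal review
    } elseif ($SourceBranch -like 'refs/heads/rc/*') {
        'AdventureWorks Reports [UAT]'      # stakeholder testing
    } else {
        $null  # no workspace defined – the three-environment pipeline isn't triggered in this case
    }
}

# e.g. a push to an rc/ branch (not a pull request build) targets the UAT workspace
Get-TargetWorkspaceName -SourceBranch 'refs/heads/rc/2024-05' -BuildReason 'IndividualCI'
```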
See it in action
This video shows the workflow in action, from creating a feature branch to make a change, through to its eventual release into production.
A different set of environments
I claimed earlier that this pattern is flexible enough to support any number of environments. The exact mechanism for doing so depends on your required workflow, but in essence you can:
- use additional pipeline triggers to initiate publishing at more points in your Git workflow
- use different combinations of conditions to determine the corresponding Power BI workspace.
As a simple example, I could extend the workflow I presented here by additionally publishing to a [Dev] workspace every time I push to a feature branch. (The intention here might be to allow me to see my Power BI report online as I work, before opening any pull request).
To achieve this, I now need to trigger the pipeline on a push to any branch (just as in the first post in the series):
```yaml
trigger:
  branches:
    include:
      - '*'
  paths:
    include:
      - powerbi-pro-devex-series/01-FirstPipeline/ExecutiveSummary.pbix
```
Workspace selection is extended with an `else` clause in the conditional insertion expression:
```yaml
variables:
  - name: workspaceName
    ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
      value: 'AdventureWorks Reports'
    ${{ elseif eq(variables['Build.Reason'], 'PullRequest') }}:
      value: 'AdventureWorks Reports [Test]'
    ${{ elseif startsWith(variables['Build.SourceBranch'], 'refs/heads/rc/') }}:
      value: 'AdventureWorks Reports [UAT]'
    ${{ else }}:
      value: 'AdventureWorks Reports [Dev]'
```
The conditional insertion expression now deploys to the Test, UAT and Production workspaces as before, but in all other cases publishes to a fourth workspace, AdventureWorks Reports [Dev].
Summary
In this article, I developed a pipeline that automatically deploys a Power BI report into different environments in response to certain actions in the developer workflow. This is much closer to the example workflow I suggested in the first post in the series 😊.
The pipeline only deploys one report – my Executive Summary sample report – but in the real world I could be managing a very large number of reports. In the next post I'll look at restructuring the Azure DevOps pipeline to enable precise report deployment with minimal configuration.
Next up: In the next post, I think about how to manage multiple Power BI reports using Azure DevOps pipelines.
Code: The code for the series is available on GitHub. The files specific to this article are in the `powerbi-pro-devex-series/03-ReportEnvironments` folder. You won't be able to modify the supplied report definition (unless you point it at your own shared dataset), but that shouldn't affect your ability to deploy it.