Reusable deployment pipelines for Power BI
This is Part 4 of a series about creating a professional developer experience for working with Power BI. If you haven't already seen the first, you may prefer to start there.
In the first post in this series I looked at publishing a Power BI report automatically, then in the previous post I created a pipeline to promote a report through a series of Power BI workspaces (corresponding to user environments).
But what happens when – of course – you need to manage many reports? You could:
- Use a single pipeline to deploy every report, republishing everything any time one report is modified.
- Build one pipeline per workspace (behaviour similar to Power BI deployment pipelines).
- Implement one pipeline per report.
The pipeline-per-report model offers precisely targeted deployments, because a report is only ever published when it has been modified. If you want to see how that might look in practice, in this video (5m26s) I update a few reports – using the workflow I described in the previous post – and let their deployment pipelines take care of everything else:
Creating one pipeline per report means that each report is only deployed when it has changed. To be really clear, this means that:
- any changed report is published automatically
- every changed report is published automatically
- a report is only published if it has changed
…and all this happens without you doing anything special after initial report creation. This supports precisely targeted deployments – you publish only the reports that need to be published, no more, no less.
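The mechanism behind this is the path filter on each report pipeline's trigger – the complete pipeline definition appears later in this post, but as a minimal, illustrative sketch (the folder path here is invented for the example):

```yaml
# Illustrative sketch only: run this report's pipeline when files in its own folder change
trigger:
  branches:
    include:
    - main
  paths:
    include:
    - /reports/MyReport   # hypothetical folder – only changes under it trigger a deployment
```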
Implementing one pipeline per report places additional demands on a developer when creating a new report. To make this easier to manage, in this post I look at how to make pipeline creation as simple as possible, by building each pipeline from a set of reusable components.
Simplify pipeline implementation
Simplifying pipeline implementation isn't just about making life easier for a developer today, but also in the future. What happens if my development workflow changes? What happens if I introduce additional environments? When required, I want the flexibility to accommodate changes with minimum effort.
Implementing common features of a pipeline as reusable components makes this objective easier to achieve. When an element of pipeline functionality needs to change, I can update the corresponding component in one place, yet modify the behaviour of every pipeline. To factor out common components I will:
- Share secret variables between pipelines by storing them in a variable group.
- Extract non-secret report configuration variables into a variable template.
- Extract pipeline execution details into a reusable template.
Share secret variables
Secret variables such as deployment credentials must be stored securely, outside version control systems. In previous posts I have defined secret variables for use by a single pipeline, using the Azure DevOps UI – this approach is undesirable when multiple pipelines are required, because secrets must be maintained separately for every pipeline.
A better alternative is to create secret variables in a variable group. Variable groups can be referenced by multiple pipelines, enabling the variables they contain to be used by any referencing pipeline.
Variable groups are created in the “Pipelines” area of the Azure DevOps UI – choose “Pipelines” → “Library” in the sidebar menu, then click “+ Variable group”:
When creating a new group you can choose either to specify secret variables directly, or to link the group to an Azure Key Vault. Linking to a key vault is a good choice if you are already using the key vault service to manage secrets.
Once you have created a group, added at least one variable, and saved it, the option to set Pipeline permissions becomes available:
You can allow individual pipelines to access the variable group, or can open access to all pipelines in the project. “Open access” need not be insecure if you have a project dedicated to Power BI report management, and is simpler than having to grant access explicitly to every new report pipeline.
When creating secret variables in the group, make sure you use the lock button (padlock icon) to mark the variable secret.
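Once the group exists and a pipeline has permission to use it, the pipeline loads the group's variables with a single `group` reference. One detail worth remembering: secret variables are never exposed to script steps automatically, so they must be mapped explicitly into environment variables. Here's a minimal sketch, using the group and secret names that appear later in this post:

```yaml
variables:
- group: DeploymentSecrets   # shared secrets: AzureTenantId, AzureClientId, AzureClientSecret

steps:
- task: PowerShell@2
  displayName: Show which service principal will deploy
  inputs:
    targetType: inline
    script: Write-Host "Deploying as client $env:AZURE_CLIENT_ID"
  env:
    # secrets must be mapped explicitly – they are not inherited by script steps
    AZURE_CLIENT_ID: $(AzureClientId)
    AZURE_CLIENT_SECRET: $(AzureClientSecret)
```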
Manage report configuration
An Azure DevOps template is a reusable, parameterisable YAML file that defines part of a pipeline for use by other pipelines. Templates can specify variables, job steps, jobs or entire pipeline stages.
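As a quick illustration (not part of this post's solution), a minimal steps template might wrap a single script step behind a parameter:

```yaml
# say-hello.yaml – a trivial steps template, for illustration only
parameters:
- name: name
  type: string

steps:
- script: echo "Hello, ${{ parameters.name }}"
  displayName: Say hello
```

A pipeline uses it by adding `- template: say-hello.yaml` under its own `steps` and supplying a value for the `name` parameter – the same pattern I'll use below for variables and stages templates.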
For ease of configuration, I'm going to extract report-specific pipeline variables for each report into its own separate variables template file, named `metadata.yaml`. Strictly speaking this isn't necessary, because each report needs its own pipeline YAML file (even if only a basic one), but it cleanly separates configuration data from deployment code.
Here's the `metadata.yaml` file for my Executive Summary sample report:
```yaml
variables:
  reportName: 'Executive Summary'
  description: |
    A summary of sales performance for company executives.
  workspaceName: 'AdventureWorks Reports'
  isDeleted: false
```
Notice that it includes the report's display name, and the workspace into which it is published – details which were previously hard-coded into the pipeline definition. It also includes two new variables:
- `description` provides a description of the report. You might not need to use this in the pipeline – in fact I won't. Including it here illustrates that you could use report metadata to store any report information in version control, even if it's not needed by the pipeline.
- setting `isDeleted` to true will allow me to “unpublish” reports (delete them from Power BI workspaces) – a revised PowerShell deployment script is included in the files on GitHub accompanying this post.
Storing report metadata in configuration files is good practice because it can be version controlled 😀. Don't be tempted to store secret values in version control, because doing so is insecure – use variable groups for these!
Notice that a report now has three components – its PBIX file, a `pipeline.yaml` pipeline definition file and now the `metadata.yaml` configuration file. To make this easier to manage, I'm going to organise my report files in version control using a common structure:
Report-related files are stored under the top level “reports” folder (indicated in the screenshot). Each report has its own subfolder, containing three files which always have the same names:
- the report's PBIX file, `Report.pbix`
- the deployment pipeline definition, `pipeline.yaml`
- report configuration settings, `metadata.yaml`.
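Putting that together with the `tools` folder described in the next section, the structure in the accompanying repository looks like this:

```
powerbi-pro-devex-series/04-MultipleReports/
├── reports/
│   ├── ExecutiveSummary/
│   │   ├── Report.pbix
│   │   ├── pipeline.yaml
│   │   └── metadata.yaml
│   └── (one subfolder per report, containing the same three files)
└── tools/
    ├── Deploy-PbiReport.ps1
    └── report-pipeline-stages.yaml
```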
Use a pipeline template
Below is a YAML stages template, based on the deployment pipeline I've been using in previous posts. You can tell it's a template because it specifies a parameter (lines 1-3) called `reportFolder` – a value for this parameter must be provided by any YAML pipeline that uses the template.
```yaml
parameters:
- name: reportFolder
  type: string

stages:
- stage: Jobs
  variables:
  - group: DeploymentSecrets
  - template: ${{ parameters.reportFolder }}/metadata.yaml
  jobs:
  - job: Job
    displayName: Publish Power BI report
    pool:
      vmImage: ubuntu-latest

    variables:
    - name: targetWorkspace
      ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
        value: '$(workspaceName)'
      ${{ elseif eq(variables['Build.Reason'], 'PullRequest') }}:
        value: '$(workspaceName) [Test]'
      ${{ elseif startsWith(variables['Build.SourceBranch'], 'refs/heads/rc/') }}:
        value: 'AdventureWorks Reports [UAT]'

    steps:
    - task: PowerShell@2
      displayName: Publish report
      inputs:
        targetType: filePath
        filePath: "$(System.DefaultWorkingDirectory)/${{ parameters.reportFolder }}/../../tools/Deploy-PbiReport.ps1"
        arguments: >
          -ReportName "$(reportName)"
          -WorkspaceName "$(targetWorkspace)"
          -PbixFilePath "$(System.DefaultWorkingDirectory)/${{ parameters.reportFolder }}/Report.pbix"
          -IsDeleted $$(isDeleted)
        failOnStderr: true
      env:
        AZURE_TENANT_ID: $(AzureTenantId)
        AZURE_CLIENT_ID: $(AzureClientId)
        AZURE_CLIENT_SECRET: $(AzureClientSecret)
```
The template defines a pipeline stage which:
- loads shared secrets from the `DeploymentSecrets` variable group (line 8)
- uses the `reportFolder` parameter value to locate the report's configuration variables (line 9)
- uses the `workspaceName` variable loaded on line 9 to set the target workspace in a new `targetWorkspace` variable (lines 16-23)
- publishes the Power BI report (lines 26-40), using secrets loaded on line 8, report metadata loaded on line 9, and file locations based on the `reportFolder` parameter value.
The PowerShell deployment script is now referenced (line 30) in a separate `tools` directory, at the same level as the `reports` folder (and visible in the screenshot above) – this is also the location in which this stages template file (named `report-pipeline-stages.yaml`) is stored.
The files and folder structure described here are available on GitHub – there's a link at the end of the post.
Implement report pipelines
Now that the body of the pipeline has been extracted into the stages template, each report's YAML pipeline has two functions:
- Define when the pipeline runs
- Reference the stages template
Here's the revised deployment pipeline for my Executive Summary sample report:
```yaml
trigger:
  branches:
    include:
    - rc/*
    - main
  paths:
    include:
    - /powerbi-pro-devex-series/04-MultipleReports/reports/ExecutiveSummary

pr:
  branches:
    include:
    - rc/*
  paths:
    include:
    - /powerbi-pro-devex-series/04-MultipleReports/reports/ExecutiveSummary

stages:
- template: ../../tools/report-pipeline-stages.yaml
  parameters:
    reportFolder: /powerbi-pro-devex-series/04-MultipleReports/reports/ExecutiveSummary
```
This is much simpler than earlier versions of this pipeline! Other reports' pipeline files are similar to this one, differing only in the report folder path.
The report folder path appears three times in the pipeline file, which seems redundant but is unavoidable – `trigger` and `pr` trigger definitions cannot be placed in a template and cannot use variables.
Finally, the new pipeline YAML file must be pushed to the central Git repository so that you can use it to create an Azure DevOps pipeline. Using a variable group means that you no longer need to create secret variables for the new pipeline, but make sure that the pipeline has permission to access the variable group as I described earlier.
Creating new reports
When you create a new report you also need to create its deployment pipeline, but the overhead of doing so is much lower with a pipeline pattern like the one I've described here. When creating a report you must:
1. Create a PBIX file. Continue to create Power BI Desktop files as you do normally – if you have a blank report template you like to use, use that – but save your file in a new subfolder of the `reports` folder, and name it `Report.pbix`.
2. Create a pipeline file. Copy an existing `pipeline.yaml` file to the report's subfolder, then update the three report location values in the file (see the example after this list).
3. Create a metadata file. Copy an existing `metadata.yaml` file to the report's subfolder, then configure the new report's metadata.
4. Push your feature branch. You've done steps 1-3 in a Git feature branch, right? 😜 Push the branch to your central Git repository!
5. Create the Azure DevOps pipeline. Create an Azure DevOps pipeline referencing your new `pipeline.yaml` file. Remember that you'll need to choose the file in your feature branch (because it doesn't exist in other branches yet).
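For step 2, the only values that change are the three folder paths (as noted earlier, they can't come from a variable or template). For a hypothetical new report stored in reports/RegionalSales, the copied pipeline.yaml would end up looking like this (the RegionalSales name is purely illustrative):

```yaml
# pipeline.yaml for a hypothetical new report in reports/RegionalSales
trigger:
  branches:
    include:
    - rc/*
    - main
  paths:
    include:
    - /powerbi-pro-devex-series/04-MultipleReports/reports/RegionalSales

pr:
  branches:
    include:
    - rc/*
  paths:
    include:
    - /powerbi-pro-devex-series/04-MultipleReports/reports/RegionalSales

stages:
- template: ../../tools/report-pipeline-stages.yaml
  parameters:
    reportFolder: /powerbi-pro-devex-series/04-MultipleReports/reports/RegionalSales
```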
Now you've enabled automatic deployment of your report 😀.
I take this a step further in a later post, scripting the report setup process and creating its DevOps pipeline automatically.
Summary
In this article, I refactored the report deployment pipeline to make it as simple as possible to reuse, and described five simple steps for creating a new report and its deployment pipeline.
If I can explain the steps to create a new report, I should be able to automate them – I'll do exactly that in a later post 😊.
Next up: In the last couple of posts, I've looked at deploying multiple Power BI reports through a sequence of different workspace environments. In the next post, I extend this approach to shared datasets, and look at managing the consequences that has for report deployment.
Code: The code for the series is available on GitHub. The files specific to this article are in the `powerbi-pro-devex-series/04-MultipleReports` folder.
Share: If you found this article useful, please share it!