Reusable deployment pipelines for Power BI

This is Part 4 of a series about creating a professional developer experience for working with Power BI. If you haven't already read the first post, you may prefer to start there.

In the first post in this series I looked at publishing a Power BI report automatically, then in the previous post I created a pipeline to promote a report through a series of Power BI workspaces (corresponding to user environments).

But what happens when – inevitably – you need to manage many reports? You could:

  • Use a single pipeline to deploy every report, republishing everything any time one report is modified.
  • Build one pipeline per workspace (behaviour similar to Power BI deployment pipelines).
  • Implement one pipeline per report.

The pipeline-per-report model offers precisely targeted deployments, because a report is only ever published when it has been modified. If you want to see how that might look in practice, in this video (5m26s) I update a few reports – using the workflow I described in the previous post – and let their deployment pipelines take care of everything else:

Creating one pipeline per report means that each report is only deployed when it has changed. To be really clear, this means that:

  • any changed report is published automatically
  • every changed report is published automatically
  • a report is only published if it has changed

…and all this happens without you doing anything special after initial report creation. This supports precisely targeted deployments – you publish only the reports that need to be published, no more, no less.

Implementing one pipeline per report makes additional demands of a developer when creating a new report. To make this easier to manage, in this post I look at how to make pipeline creation as simple as possible, by building each pipeline from a set of reusable components.

Simplifying pipeline implementation isn't just about making life easier for a developer today, but also in the future. What happens if my development workflow changes? What happens if I introduce additional environments? When required, I want the flexibility to accommodate changes with minimum effort.

Implementing common features of a pipeline as reusable components makes this objective easier to achieve. When an element of pipeline functionality needs to change, I can update the corresponding component in one place, modifying the behaviour of every pipeline. To factor out common components I will:

  • store secret variables shared by all pipelines in a variable group
  • extract each report's configuration into its own variables template
  • move the body of the deployment pipeline into a reusable stages template, leaving each report with a minimal pipeline definition of its own.

Secret variables such as deployment credentials must be stored securely, outside version control systems. In previous posts I have defined secret variables for use by a single pipeline, using the Azure DevOps UI – this approach is undesirable when multiple pipelines are required, because secrets must be maintained separately for every pipeline.

A better alternative is to create secret variables in a variable group. Variable groups can be referenced by multiple pipelines, enabling the variables they contain to be used by any referencing pipeline.
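For example, a pipeline can load every variable in a group with a single reference – this sketch assumes the group is named DeploymentSecrets, the name used later in this post:

```yaml
# Sketch: a pipeline referencing a shared variable group.
# Any pipeline granted permission to the group can do this.
variables:
- group: DeploymentSecrets

steps:
# Non-secret variables in the group are available via $(...) macro syntax;
# secret variables must be passed explicitly, e.g. via env mappings on a task.
- script: echo "Deploying with client ID $(AzureClientId)"
```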

Variable groups are created in the “Pipelines” area of the Azure DevOps UI – choose “Pipelines” → “Library” in the sidebar menu, then click “+ Variable group”:

When creating a new group you can choose either to specify secret variables directly, or to link the group to an Azure Key Vault. Linking to a key vault is a good choice if you are already using the key vault service to manage secrets.

Once you have created a group, added at least one variable, and saved it, the option to set Pipeline permissions becomes available:

You can allow individual pipelines to access the variable group, or open access to all pipelines in the project. “Open access” need not be insecure if you have a project dedicated to Power BI report management, and is simpler than granting access explicitly to every new report pipeline.

When creating secret variables in the group, make sure you use the lock button (padlock icon) to mark the variable secret.

An Azure DevOps template is a reusable, parameterisable YAML file defining part of a pipeline for use by other pipelines. Templates can define variables, steps, jobs or entire stages.
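As a minimal illustration (these file names are hypothetical, not part of this post's solution), here is a steps template and a pipeline that uses it:

```yaml
# greeting-steps.yaml – a reusable steps template with one parameter
parameters:
- name: name
  type: string

steps:
- script: echo "Hello, ${{ parameters.name }}!"
  displayName: Greet
```

```yaml
# azure-pipelines.yaml – a pipeline consuming the template
steps:
- template: greeting-steps.yaml
  parameters:
    name: 'Power BI'
```

Template parameters are resolved at compile time, before the pipeline runs, which is what makes templates suitable for assembling pipeline structure from reusable parts.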

For ease of configuration, I'm going to extract each report's pipeline variables into its own variables template file, named metadata.yaml. Strictly speaking this isn't necessary, because each report needs its own pipeline YAML file anyway (even if only a basic one), but it cleanly separates configuration data from deployment code.

Here's the metadata.yaml file for my Executive Summary sample report:

  variables:
    reportName: 'Executive Summary'
    description: |
      A summary of sales performance for company executives.
    workspaceName: 'AdventureWorks Reports'
    isDeleted: false

Notice that it includes the report's display name, and the workspace into which it is published – details which were previously hard-coded into the pipeline definition. It also includes two new variables:

  • description provides a description of the report. You might not need to use this in the pipeline – in fact I won't. Including it here illustrates that you could use report metadata to store any report information in version control, even if it's not needed by the pipeline.
  • setting isDeleted to true will allow me to “unpublish” reports (delete them from Power BI workspaces) – a revised PowerShell deployment script is included in the files on GitHub accompanying this post.

Storing report metadata in configuration files is good practice because it can be version controlled 😀. Don't be tempted to store secret values in version control, because doing so is insecure – use variable groups for these!

Notice that a report now has three components – its PBIX file, a pipeline.yaml pipeline definition file and now the metadata.yaml configuration file. To make this easier to manage, I'm going to organise my report files in version control using a common structure:

Report-related files are stored under the top level “reports” folder (indicated in the screenshot). Each report has its own subfolder, containing three files which always have the same names:

  • the report's PBIX file, Report.pbix
  • the deployment pipeline definition, pipeline.yaml
  • report configuration settings, metadata.yaml.
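Putting this together with the shared tools folder described later in the post, the repository layout looks something like this (the second report folder, RegionalSales, is illustrative):

```
reports/
  ExecutiveSummary/
    Report.pbix
    pipeline.yaml
    metadata.yaml
  RegionalSales/
    Report.pbix
    pipeline.yaml
    metadata.yaml
tools/
  Deploy-PbiReport.ps1
  report-pipeline-stages.yaml
```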

This is a YAML stages template, based on the deployment pipeline I've been using in previous posts. You can tell it's a template because it specifies a parameter (lines 1-3) called reportFolder – a value for this parameter must be provided by any YAML pipeline that uses the template.

   1. parameters:
   2. - name: reportFolder
   3.   type: string
   4.
   5. stages:
   6. - stage: Jobs
   7.   variables:
   8.   - group: DeploymentSecrets
   9.   - template: ${{ parameters.reportFolder }}/metadata.yaml
  10.   jobs:
  11.   - job: Job
  12.     displayName: Publish Power BI report
  13.     pool:
  14.       vmImage: ubuntu-latest
  15.     variables:
  16.     - name: targetWorkspace
  17.       ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
  18.         value: '$(workspaceName)'
  19.       ${{ elseif eq(variables['Build.Reason'], 'PullRequest') }}:
  20.         value: '$(workspaceName) [Test]'
  21.       ${{ elseif startsWith(variables['Build.SourceBranch'], 'refs/heads/rc/') }}:
  22.         value: 'AdventureWorks Reports [UAT]'
  23.     steps:
  24.     - task: PowerShell@2
  25.       displayName: Publish report
  26.       inputs:
  27.         targetType: filePath
  28.         filePath: "$(System.DefaultWorkingDirectory)/${{ parameters.reportFolder }}/../../tools/Deploy-PbiReport.ps1"
  29.         arguments: >
  30.           -ReportName "$(reportName)"
  31.           -WorkspaceName "$(targetWorkspace)"
  32.           -PbixFilePath "$(System.DefaultWorkingDirectory)/${{ parameters.reportFolder }}/Report.pbix"
  33.           -IsDeleted $$(isDeleted)
  34.         failOnStderr: true
  35.       env:
  36.         AZURE_TENANT_ID: $(AzureTenantId)
  37.         AZURE_CLIENT_ID: $(AzureClientId)
  38.         AZURE_CLIENT_SECRET: $(AzureClientSecret)

The template defines a pipeline stage which:

  • loads shared secrets from the DeploymentSecrets variable group (line 8)

  • uses the reportFolder parameter value to locate the report's configuration variables (line 9)

  • uses the workspaceName variable loaded on line 9 to set the target workspace in a new targetWorkspace variable (lines 16-22)

  • publishes the Power BI report (lines 24-38), using secrets loaded on line 8, report metadata loaded on line 9, and file locations based on the reportFolder parameter value.

The PowerShell deployment script is now referenced (line 28) in a separate tools directory, at the same level as the reports folder (and visible in the screenshot above) – this is also the location in which this stages template file (named report-pipeline-stages.yaml) is stored.
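The deployment script itself isn't reproduced in this post (the full version is in the accompanying GitHub files), but its parameters can be inferred from the arguments passed above – this is a hedged sketch, and the comments describing its body are assumptions:

```powershell
# Deploy-PbiReport.ps1 (sketch – see the GitHub repository for the real script)
param (
    [Parameter(Mandatory = $true)][string]$ReportName,
    [Parameter(Mandatory = $true)][string]$WorkspaceName,
    [Parameter(Mandatory = $true)][string]$PbixFilePath,
    [bool]$IsDeleted = $false
)

# Authentication uses the service principal details supplied through the
# AZURE_TENANT_ID, AZURE_CLIENT_ID and AZURE_CLIENT_SECRET environment variables.

if ($IsDeleted) {
    # Remove the report from the target workspace ("unpublish")
}
else {
    # Publish $PbixFilePath to $WorkspaceName as $ReportName,
    # overwriting any existing report with the same name
}
```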

The files and folder structure described here are available on GitHub – there's a link at the end of the post.

Now that the body of the pipeline has been extracted into the stages template, each report's YAML pipeline has two functions:

  1. Define when the pipeline runs
  2. Reference the stages template

Here's the revised deployment pipeline for my Executive Summary sample report:

  trigger:
    branches:
      include:
      - rc/*
      - main
    paths:
      include:
      - /powerbi-pro-devex-series/04-MultipleReports/reports/ExecutiveSummary

  pr:
    branches:
      include:
      - rc/*
    paths:
      include:
      - /powerbi-pro-devex-series/04-MultipleReports/reports/ExecutiveSummary

  stages:
  - template: ../../tools/report-pipeline-stages.yaml
    parameters:
      reportFolder: /powerbi-pro-devex-series/04-MultipleReports/reports/ExecutiveSummary

This is much simpler than earlier versions of this pipeline! Other reports' pipeline files are similar to this one, differing only in the report folder path.

The report folder path appears three times in the pipeline file, which seems redundant but is unavoidable – the trigger and pr definitions can neither be moved into a template nor use variables.

Finally, the new pipeline YAML file must be pushed to the central Git repository so that you can use it to create an Azure DevOps pipeline. Using a variable group means that you no longer need to create secret variables for the new pipeline, but make sure that the pipeline has permission to access the variable group as I described earlier.

When you create a new report you also need to create its deployment pipeline, but the overhead of doing so is much lower with a pipeline pattern like the one I've described here. When creating a report you must:

  1. Create a PBIX file. Continue to create Power BI desktop files as you do normally – if you have a blank report template you like to use, use it – but save your file in a new subfolder of the reports folder, and name it Report.pbix.

  2. Create a pipeline file. Copy an existing pipeline.yaml file to the report's subfolder, then update the three report location values in the file.

  3. Create a metadata file. Copy an existing metadata.yaml file to the report's subfolder, then configure the new report's metadata.

  4. Push your feature branch. You've done steps 1-3 in a Git feature branch, right? 😜 Push the branch to your central Git repository!

  5. Create the Azure DevOps pipeline. Create an Azure DevOps pipeline referencing your new pipeline.yaml file. Remember that you'll need to choose the file in your feature branch (because it doesn't exist in other branches yet).

Now you've enabled automatic deployment of your report 😀.

I take this a step further in a later post, scripting the report setup process and creating its DevOps pipeline automatically.

In this article, I refactored the report deployment pipeline into reusable components to make it as simple as possible to reuse, and listed the five steps needed to set up deployment for a new report.

If I can explain the steps to create a new report, I should be able to automate them – I'll do exactly that in a later post 😊.

  • Next up: In the last couple of posts, I've looked at deploying multiple Power BI reports through a sequence of different workspace environments. In the next post, I extend this approach to shared datasets, and look at managing the consequences that has for report deployment.

  • Code: The code for the series is available on GitHub. The files specific to this article are in the powerbi-pro-devex-series/04-MultipleReports folder.

  • Share: If you found this article useful, please share it!