
Are you backing up your Power BI dataflows? Here are two ways to ensure you never lose them.

When developing a Power BI dataflow, you should consider how you are going to back up all that hard work. Have you ever lost hours, days, or even weeks of work by accidentally deleting a file or forgetting to save and back it up? Don't let that happen with your dataflows.

Below are two examples of how to automate the backup of a dataflow and easily recover it in case you or someone else deletes it.

Hasta la vista

Requirement

Need the ability to back up Power BI dataflows in an automated fashion.

Solutions

  1. Enable Azure storage connection on a Power BI workspace
  2. Use Power Automate or Azure Logic Apps with the Power BI APIs to export the dataflow definition to a JSON file in SharePoint or Azure Data Lake.

Option 1 – Azure Storage Connection

The first option is to enable the Azure storage connection either on the Power BI tenant or on a Power BI workspace. Note that this must be enabled before any dataflows have been created in the workspace.

Steps

  1. Enable Power BI tenant-level or workspace-level storage permissions
  2. Point Power BI workspace to a storage account

Enable Storage Permissions

A Power BI admin must enable either tenant-level or workspace-level storage permissions. The setting is documented here: Power BI BYO Resources

Enabling workspace-level storage permissions offers more flexibility to divide data across different storage accounts.

Associate Power BI Workspace with Storage

Once either setting has been enabled, workspace admins can point a workspace to a storage account from the workspace settings. They will only be able to use storage accounts on which they hold the following roles: Storage Blob Data Owner, Storage Blob Data Reader, and Owner (either assigned on the resource or inherited).

Once the connection has been established and a dataflow has been developed, the data lake keeps a copy of the dataflow definition, along with a snapshot every time the dataflow is changed. Each workspace attached to storage has a folder within the powerbi container holding the current definition (model.json) and a folder for all snapshots. To recover a deleted dataflow, download the model.json file from the lake and use the last step of this post to upload it back to Power BI.
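If you ever need to pull that file programmatically rather than through the Azure portal, a short script can do it. Below is a minimal sketch using the azure-storage-file-datalake SDK; the account URL, workspace folder, and dataflow folder names are placeholders for your environment, and the exact folder layout under the powerbi container may differ slightly from what is shown.

```python
# Hedged sketch: download the current model.json for a dataflow from the
# ADLS Gen2 account attached to the Power BI workspace.
# <storage-account>, <workspace folder>, and <dataflow folder> are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://<storage-account>.dfs.core.windows.net"  # ADLS Gen2 endpoint

service = DataLakeServiceClient(account_url=ACCOUNT_URL, credential=DefaultAzureCredential())
filesystem = service.get_file_system_client("powerbi")  # container Power BI writes to

# Path assumption: one folder per workspace, one per dataflow, model.json inside.
file_client = filesystem.get_file_client("<workspace folder>/<dataflow folder>/model.json")

with open("model.json", "wb") as f:
    f.write(file_client.download_file().readall())
```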

Option 2 – Power Automate or Azure Logic Apps

If you have existing dataflows in a workspace that is not connected to Azure Storage, Power Automate or Azure Logic Apps can accomplish the same backups. One advantage of this method is that the backup can be stored in file storage other than a data lake. The steps below lay out how to build a flow using the Power BI API to store backups in either SharePoint or Azure Data Lake Gen2.

Steps

  1. Create an app registration (Service Principal) in Azure AD
  2. Add Service Principal to Azure AD security group
  3. Update Power BI Admin settings to allow security group to use Power BI APIs
  4. Add Service Principal to Power BI Workspace
  5. Create a flow in Power Automate or Azure Logic Apps to orchestrate the backup

Create App Registration

The app registration created in this step will be used to authenticate against Power BI’s API.

In Azure AD, add a new registration from the App registrations menu.

Give the app registration a name, select the appropriate supported account types, and leave the Redirect URI blank.

Once the app registration is created, generate a certificate or secret. For this example we will use a secret, as secrets are far easier to set up. On the secret setup screen, select an appropriate expiration for your environment.

Once the client secret has been generated, copy it and store it in a safe place; you will not be able to retrieve it later. If you lose the secret, a new one can be generated.

The last step for the app registration is to add the Power BI API permissions. Find the Power BI Service under APIs my organization uses in the API permissions section.

Under delegated permissions there are many options that can be granted to the app registration. For this example of backing up Power BI dataflows, Dataflow.Read.All is the only permission required.

Add App Registration to AD Security Group

Power BI admin settings can only be scoped to security groups or the entire organization. We recommend limiting API access to one or more security groups rather than the entire organization.

Back in Azure Active Directory, on the Groups menu, select New group. Create a security group and add the app registration as a member.

Update Power BI Admin Settings

You will need to have the Power BI Administrator Role to make the following change.

In the Power BI Admin Portal, under Developer Settings, enable the “Allow service principals to use Power BI APIs” setting for the security group.

Add Service Principal to Power BI Workspace

On the workspace that contains the dataflow, add the Service Principal as an admin. In our testing, adding the security group from the previous step did not work. If you named your Service Principal and AD group the same, ensure you select the one with the App ID.

Create Flow

The flow to automate the dataflow backup can be created with either Power Automate or Azure Logic Apps. For this example we will use Power Automate. One note for Power Automate users: this flow uses the premium HTTP connector, and if you choose to write the files to an Azure Data Lake, that connector is also premium.

Scheduled Cloud Flow

This flow can be triggered at any interval; for this example we will schedule it to run once a day.

HTTP Connector

In the next step of the flow select the HTTP connector and populate as described below:

  1. Method – Select GET
  2. URI – This is the path to the API for the specific dataflow to be backed up. The URI includes the GUIDs for the workspace and the dataflow, which can be found in the URL for the dataflow:

https://api.powerbi.com/v1.0/myorg/groups/{workspace guid}/dataflows/{dataflow guid}

  3. Authentication – Select Active Directory OAuth
  4. Tenant – Tenant ID from the app registration
  5. Audience – https://analysis.windows.net/powerbi/api
  6. Client ID – Client ID from the app registration
  7. Credential Type – Select Secret
  8. Secret – Secret value from the app registration
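For reference, here is a hedged sketch of the same request made outside Power Automate with Python. It uses the client credentials flow against the app registration created earlier; the tenant ID, client ID, secret, and the two GUIDs are placeholders you would supply, and the output file name is arbitrary.

```python
# Hedged sketch: acquire a token for the service principal and export the
# dataflow definition, mirroring the HTTP connector configuration above.
import json
import requests

TENANT_ID = "<tenant guid>"
CLIENT_ID = "<app registration client id>"
CLIENT_SECRET = "<app registration secret>"
WORKSPACE_ID = "<workspace guid>"
DATAFLOW_ID = "<dataflow guid>"

# Client credentials flow against Azure AD, scoped to the Power BI API audience.
token = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://analysis.windows.net/powerbi/api/.default",
    },
).json()["access_token"]

# GET the dataflow definition (the same URI used in the HTTP connector).
resp = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/dataflows/{DATAFLOW_ID}",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

# Save the definition as a .json backup file.
with open("dataflow-backup.json", "w", encoding="utf-8") as f:
    json.dump(resp.json(), f, indent=2)
```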

File Storage Option 1 – Write File to SharePoint

For a little organization within SharePoint, start with a ‘Create new folder’ action. After authenticating, select the site, list or library, and provide a folder path. The example below dynamically creates a new folder for the date the flow is run using the formatDateTime(utcNow(),'MM-dd-yyyy') expression.

In the final flow step select the SharePoint ‘Create file’ action.

  1. Site Address – Select the same site address as the create folder step
  2. Folder Path – Select ‘Full Path’ from the Create new folder step in the dynamic content menu
  3. File Name – To get the name of the dataflow, use body('HTTP')?['Name'] in the expression menu. Note that if you renamed the HTTP step, replace HTTP with the name you used. Use the utcNow() expression to include a timestamp if you plan to run the flow multiple times a day. Ensure the file name ends with a .json file extension.
  4. File Content – Select ‘Body’ from the HTTP step in the dynamic content menu.

Now, run the flow.

File Storage Option 2 – Write File to Data Lake

For this option the .json file will be written to an Azure Data Lake. Unlike the SharePoint option, there is only one step involved, as dynamic folder creation is possible with the Azure Blob Storage connector.

This example uses the access key authentication type; you could also use the Azure AD Integrated option if preferred. To retrieve an access key, open the storage account in Azure and copy either key1 or key2:

Back in Power Automate, select the ‘Create blob (V2)’ action from the Azure Blob Storage connector and enter your storage account name and key:

Once created, populate the fields based on the details below:

  1. Storage account name – Select Use connection settings. This is the connection you created in the previous step.
  2. Folder path – Starting with the container, give the folder path for the file location. In this example a folder is dynamically created using the formatDateTime(utcNow(),'MM-dd-yyyy') expression, which creates a folder for each day the flow is run.
  3. Blob name – To get the name of the dataflow, use body('HTTP')?['Name'] in the expression menu. Note that if you renamed the HTTP step, replace HTTP with the name you used. Use the utcNow() expression to include a timestamp if you plan to run the flow multiple times a day. Ensure the file name ends with a .json file extension.
  4. Blob content – Select ‘Body’ from the HTTP step in the dynamic content menu.

Now, run the flow.
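If you ever want to reproduce the same write outside a flow, the sketch below uses the azure-storage-file-datalake SDK with the account key copied above. The account name, key, container, and local file name are placeholders, and it assumes the dataflow definition was already saved locally (for example by the export script shown earlier).

```python
# Hedged sketch: upload a local dataflow backup into ADLS Gen2, mirroring the
# 'Create blob (V2)' step. Placeholders: account name, key, container, file.
from datetime import datetime, timezone
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_NAME = "<storage account name>"
ACCOUNT_KEY = "<key1 or key2>"
CONTAINER = "<container>"

service = DataLakeServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.dfs.core.windows.net",
    credential=ACCOUNT_KEY,
)
filesystem = service.get_file_system_client(CONTAINER)

# One folder per run date, matching the formatDateTime(utcNow(),'MM-dd-yyyy') expression.
folder = datetime.now(timezone.utc).strftime("%m-%d-%Y")
file_client = filesystem.get_file_client(f"{folder}/dataflow-backup.json")

with open("dataflow-backup.json", "rb") as f:
    file_client.upload_data(f.read(), overwrite=True)
```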

Recover Dataflow from Backup

To recover a dataflow from the backups, download the file from either SharePoint or the data lake and use the Import Model option. In the Power BI workspace you want to recover the dataflow to, select New -> Dataflow, then choose the Import Model option and select the .json file.
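If you would rather script the restore than use the UI, the Power BI Imports API can upload a model.json into a workspace. The sketch below is hedged: it reuses a token acquired as shown in the HTTP connector example and assumes the datasetDisplayName=model.json convention documented for dataflow imports applies in your tenant.

```python
# Hedged sketch: restore a dataflow by posting model.json to the Imports API.
# WORKSPACE_ID and the token are placeholders for your own values.
import requests

WORKSPACE_ID = "<workspace guid>"
token = "<access token acquired as in the HTTP connector example>"

with open("model.json", "rb") as f:
    resp = requests.post(
        f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/imports"
        "?datasetDisplayName=model.json",
        headers={"Authorization": f"Bearer {token}"},
        files={"value": ("model.json", f, "application/json")},
    )
resp.raise_for_status()
print(resp.json())  # returns the import id, which can be polled for status
```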

Final Thoughts

It is good practice to back up any work that you do. Hopefully this serves as a reminder of that, especially for labor-intensive tasks like building dataflows.

The introduction to the Power BI APIs was also intentional, because they unlock a lot of power for administering Power BI. If you need help getting started with Power BI automation or leveraging the Power BI APIs, click the chat button.


