Mara Pereira

Dataflows Power Automate connector: fully automate your data refresh

Updated: Mar 3, 2021


If you've been reading the Power BI announcements, you probably already know that there is a very exciting new integration with Power Automate: the Dataflows Power Automate connectors!


For those of you who don't know about these new connectors yet and are not sure what you can do with them:

Imagine you have built your dataflow, and on top of it you've built your shared dataset that is being used in multiple reports

You've set up your scheduled data refresh, both in your dataflow and your dataset

Your dataflow is scheduled to run 8 times a day, with the first run at 8AM, and the first refresh of your dataset is scheduled for 8:30AM



What if your dataflow takes longer than 30 minutes to refresh?

How can you be sure you're always looking at the latest data in your reports?


Until recently, there was really no way around this issue, but since December 2020, we have a new weapon!


With the new Power Automate Dataflows connector you can configure your dataset to refresh the data immediately after your dataflow refresh finishes.


This makes the process of data refresh across datasets and dataflows a lot more streamlined and consistent.


In this post, I'm going to guide you through completely automating the refresh of data coming from files stored in a SharePoint folder, using the new Dataflows connectors.

In summary, we will build the following process with two Flows, the Dataflow Refresh Flow and the Dataset Refresh Flow:


Dataflow Refresh Flow:

Checks if there is a new file, or changes to an existing file, in a specific SharePoint folder. If this condition is true, this Flow will try to refresh the Dataflow.


Dataset Refresh Flow:

If the previous Flow was successful, meaning the dataflow refresh succeeded, this Flow refreshes a specific dataset and sends a notification to designated users once it finishes. Otherwise, if the dataflow refresh failed, it sends an email notification with the refresh failure details to a list of specific users.
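To make the logic concrete before we build it, here is a minimal Python sketch of what the two Flows do together. Every function name here is a hypothetical stand-in for a Power Automate step, not a real API; the print statements just mark where each action would run:

```python
# A minimal sketch of the logic behind the two Flows.
# All helper functions are hypothetical placeholders.

def refresh_dataflow(name: str) -> str:
    print(f"Refreshing dataflow '{name}'...")
    return "Success"  # surfaced in Power Automate as "Refresh Status"

def refresh_dataset(name: str) -> None:
    print(f"Refreshing dataset '{name}'...")

def notify_users_via_teams(message: str) -> None:
    print(f"Teams notification: {message}")

def email_data_owners(message: str) -> None:
    print(f"Email to data owners: {message}")

# Flow 1 (Dataflow Refresh Flow): a file changed in the SharePoint folder
status = refresh_dataflow("Avocado Prices")

# Flow 2 (Dataset Refresh Flow): runs when the dataflow refresh completes
if status == "Success":
    refresh_dataset("Avocado Prices")
    notify_users_via_teams("Dataflow and dataset refreshed successfully!")
else:
    email_data_owners("The dataflow refresh failed, please investigate.")
```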


Let's check how you can do this!



1) Create your Automated Cloud Flow with a Dataflow Refresh Complete trigger:

Click on "Create"

Choose the "Automated Cloud Flow"

Give a name to your flow and under "Choose your flow's trigger" search for the word dataflow

Choose the "When a dataflow refresh is completed" trigger

Click on "Create"




2) Once you've created your flow with the "When a dataflow refresh completes" trigger, it's time to make sure everything went OK...

You will notice the trigger step is already in place. With this step, your flow will automatically run whenever the trigger is activated; in this case, whenever the dataflow of your choice finishes its refresh.

The next step is what we call an action. When the trigger is activated (again, when your chosen dataflow finishes its refresh), the flow will run the actions that follow. We want to refresh our Power BI dataset when the dataflow's refresh is successful, so here is what you need to do:

On the "When a dataflow refresh completes" step choose:

Group Type: Workspace

Group: the name of the workspace where the dataflow you want to use is located

Dataflow: name of the dataflow you want to use (Avocado Prices for me)


Click on "+ New Step"

Select "Condition"

The reason we are adding the Condition step is to verify whether the Dataflow refresh was successful. If yes, we refresh the dataset. If not, we can, for example, send an email to the data owners so that they can investigate why the dataflow refresh failed.

On the first value of the condition select "Refresh Status"

Leave the comparison as "Equal to"

On the second value of the condition write "Success"
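The "Refresh Status" value is dynamic content surfaced by the "When a dataflow refresh completes" trigger. If you ever want to check the same outcome outside Power Automate, the Power BI REST API exposes a dataflow's refresh transactions. Here is a rough Python sketch, assuming you already have an Azure AD access token; the IDs are placeholders and the exact response fields may vary slightly:

```python
import requests

# Hypothetical placeholders: supply your own values.
ACCESS_TOKEN = "<azure-ad-access-token>"
WORKSPACE_ID = "<workspace-guid>"
DATAFLOW_ID = "<dataflow-guid>"

# List the refresh transactions for a dataflow; the most recent one
# carries the same status the Flow condition checks.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/dataflows/{DATAFLOW_ID}/transactions"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

latest = resp.json()["value"][0]  # newest transaction first
print(latest["status"])  # e.g. "Success" or "Failed"
```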




3) Let's look at the "If Yes" branch of the condition.

So, in summary, if the dataflow refreshes successfully, we want to automatically start our dataset refresh. Once everything is completed, we would also like to send a notification via Teams to specific users, informing them that the refresh was successful.

On the "If Yes" branch, click on "Add an action"

In the search box, search for "dataset refresh"

Select the "Refresh a dataset (preview)" option (there's a quick code sketch of this action after this list) and choose:

Workspace: the workspace where the dataset you want to refresh is. It can be in a different workspace from the dataflow you used in the previous step

Dataset: the name of the dataset you want to refresh

Click again on "Add an action"

Search for "Teams" on the search box and select the "Post a message as the Flow bot to a user". You can also choose to send the message to a Teams Channel or even send an email notification!

Recipient: users you want to alert. This needs to be individual users, it doesn't

accept a group as an entry for this field

Message: Say something stating the Dataflow and Dataset you chose were

successfully refreshed

Header: Optional field, add something like "Dataflow refresh successful"
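For reference, the "Refresh a dataset (preview)" action performs essentially the same operation as the Power BI REST API's dataset refresh endpoint. A minimal Python sketch under the same assumptions as before (a valid Azure AD access token, placeholder IDs):

```python
import requests

# Hypothetical placeholders: supply your own values.
ACCESS_TOKEN = "<azure-ad-access-token>"
WORKSPACE_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"

# Queue a dataset refresh, the operation the
# "Refresh a dataset (preview)" action performs for you.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()  # 202 Accepted means the refresh was queued
print("Dataset refresh queued:", resp.status_code)
```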



An idea on how to write a Teams Bot Message:
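For example, something along these lines (adapt the names to your own dataflow and dataset):

"Hi! The dataflow Avocado Prices and its dataset have been refreshed successfully. Your reports are now showing the latest data."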




4) Configuring the "If No" branch to send email notifications to the data owners

If the dataflow refresh is not successful, we want our Flow to automatically send an email notification to the Data Owners, with some basic information about our refresh attempt.


Click on "Add an action"

In the search box, write "Email" and select "Send an email notification (V3)"

Define your email notification, filling in the "To" (this time it can be a group), "Subject" and "Body" fields. Be creative here :)


Save and Test your Flow!!



+ GO THE EXTRA MILE +


If you want to completely automate your dataflows refresh, there is another new Power Automate connector called "Refresh a dataflow".


This means that you can, for example, refresh your dataflow only when new data is added to your data source.


Think about it... your complete refresh process would be automated!


The complete process would be:

New data is added to the data source → Flow is triggered → Dataflow is refreshed

Dataflow is successfully refreshed → Dataset is refreshed → Users are notified



Let's do an example with files in a SharePoint folder.

What we want is: whenever a new file is added or modified in a SharePoint folder, refresh the associated dataflow.


Once the dataflow refresh is successful, the first Flow we built (Dataset refresh flow) will be triggered. The dataset will be refreshed and the users notified once the entire process is finished.


Click on "Create"

Choose the "Automated Cloud Flow"

Give a name to your flow and under "Choose your flow's trigger" search for the word SharePoint

Choose the "When a file is created or modified in a folder" trigger

Choose the SharePoint site and folder you're using as the data source for your dataflow

Click on "+New step"

In the search box write "dataflow"

Choose the "Refresh a dataflow (Preview) option"


Save and Test your Flow!!


Wondering why we didn't add a Condition step in this flow to check if the refresh was successful?

We already have that condition in the "Dataset refresh" flow, as it verifies if the dataflow refresh was successful before refreshing the dataset.




Notes


1) The Power BI Dataflows connectors in Power Automate are Premium Connectors.

You will need a license to use these connectors once they become generally available (GA).


2) You can't have more than one trigger in each Flow

This is the reason you should create two separate Flows if you want to fully automate your refresh process: one Flow to detect changes in the data source and refresh the dataflow, and another one to refresh the dataset once the dataflow refresh is finished.


3) Refresh frequency limitations

Existing refresh limits apply when running the refresh dataset action in Power Automate. For datasets in shared capacity used with Power BI Pro, your refresh action is limited to 8 refreshes per day. In Premium capacity, there is no limit on the number of refreshes per day, although you are limited by the available resources in your capacity.

