I have a Premium P1 SKU. Dataflows are in a premium workspace. Dataflows feed into several datasets.
Question: Previously, I was updating each dataflow in a schedule like this:
2023 dataflow: 8am and every hour thereafter
2024 dataflow: 8am and every hour thereafter (same schedule)
Then I was updating the "All" dataflow at 8:30 (and every half hour thereafter).
Then the datasets that these feed into were being updated on the half hour as well (along with the "All" dataflow).
I had read somewhere that based on having "linked dataflows", which you can see below, I would not need to update the yearly dataflows, and could just update the "All" dataflow. So I turned off their refresh. But I've now tested this for a few days and the yearly dataflows are definitely not updating the "All" dataflow.
Recommendations on how I should set this up? The only dataflow I bring into the datasets is the "All" dataflow. The question really is about how to set up the refresh schedules (of the yearly dataflows, the "All" dataflow, and the dataset itself). And why are the "linked" dataflows not updating my "All" dataflow?
Any help is appreciated!
@lbendlin Yes, I would love to go that route, but our IT dept has disabled that ability due to data sharing policies. So I'm back to my original question as to what is the best way to handle the refresh of the (for example) 2023 and 2024 dataflows, and the "All" dataflow (which contains both of these dataflows), then the dataset itself. Thoughts?
You can recreate the Power Automate features with calls to the Power BI REST API. Or did your IT department block that too?
@lbendlin I have never tried it, but looking at the documentation it seems to say that this requires a service principal account, which I do not have.
I am using it with my regular AAD account. Don't let the name of the login cmdlet throw you off.
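As a rough sketch of that approach, something like the following uses the MicrosoftPowerBIMgmt PowerShell module to sign in with a regular AAD account and trigger a refresh of the "All" dataflow through the REST API. The workspace and dataflow GUIDs are placeholders you would fill in, and it assumes the module is installed and your account has access to the workspace:

```powershell
# One-time install of the Power BI management module, if needed:
# Install-Module MicrosoftPowerBIMgmt

# Interactive sign-in with a regular AAD account - no service principal required.
Connect-PowerBIServiceAccount

# Placeholder IDs - replace with your workspace and "All" dataflow GUIDs.
$workspaceId = "<workspace-guid>"
$dataflowId  = "<all-dataflow-guid>"

# Kick off a refresh of the "All" dataflow via the Power BI REST API.
# Relative URLs are resolved against https://api.powerbi.com/v1.0/myorg/.
Invoke-PowerBIRestMethod -Method Post `
    -Url "groups/$workspaceId/dataflows/$dataflowId/refreshes" `
    -Body '{"notifyOption":"NoNotification"}'
```

The same pattern works for the semantic model afterwards (POST to groups/{workspaceId}/datasets/{datasetId}/refreshes), so you can chain the yearly dataflows, the "All" dataflow, and the dataset in one script instead of staggering refresh schedules.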
Consider using Power Automate instead. It has triggers for "when a dataflow refresh completes". You can then initiate another dataflow refresh, or a semantic model refresh. Sadly, there is no "when a dataset refresh completes" trigger, but you can emulate that by polling the refresh status.
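As a minimal sketch of the polling idea (again assuming the MicrosoftPowerBIMgmt module, placeholder GUIDs, and that you are already signed in with Connect-PowerBIServiceAccount; the 60-second interval is arbitrary), you can watch the latest refresh entry of the dataset until it leaves the in-progress state:

```powershell
# Placeholder IDs - replace with your workspace and dataset GUIDs.
$workspaceId = "<workspace-guid>"
$datasetId   = "<dataset-guid>"

do {
    Start-Sleep -Seconds 60
    # Get the most recent refresh history entry for the dataset.
    $latest = (Invoke-PowerBIRestMethod -Method Get `
        -Url "groups/$workspaceId/datasets/$datasetId/refreshes?`$top=1" |
        ConvertFrom-Json).value[0]
    Write-Host "Refresh status: $($latest.status)"
} while ($latest.status -eq "Unknown")   # "Unknown" = refresh still in progress

# At this point $latest.status is "Completed", "Failed", or "Disabled",
# and you can trigger the next step in the chain.
```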