mollycat
Helper II

Dataset refresh via Power Automate Flow - clarifications

Hello! I have set up a Power Automate flow to refresh a dataset in Power BI. My understanding is that because I am working in a Premium capacity workspace, we can have up to 48 scheduled refreshes per day (and that this is per dataset, not per workspace), but that on-demand/manual refreshes do not count towards this limit. Please correct me if any of these items are incorrect.

I have three questions:

 

1. I'm assuming that, even though the semantic model refresh is turned off, refreshes done via this flow will count towards the 48 scheduled refresh limit?

 

Screenshot 2025-08-13 193140.png

 

2. Is there a way to add a button or some sort of trigger to the report within the Power BI App so that an end user can trigger the flow?

 

3. If yes to number 2, would these "manual" triggers of the flow also count towards the 48 refresh limit, or would they be considered manual runs and therefore not count towards the limit?

 

Thank you!

1 ACCEPTED SOLUTION
v-agajavelly
Community Support

Hi @mollycat ,

Unfortunately, there’s no sneaky way around this one in Premium capacity: anything that refreshes your dataset via Power Automate (or any other API) does count toward the 48-per-day limit for that dataset, even if you trigger it manually with a button inside the report. The only refreshes that don’t count are when you (or someone with permission) click REFRESH NOW in the Power BI Service. The good news is you can add a Power Automate button in the report so end users can kick off a refresh themselves; it just means each click will still eat into that 48-refresh daily quota. If you really need more than 48 automated/API refreshes a day, the only real workaround today is to go through the XMLA endpoint, which doesn’t have the same cap.
More details here in Microsoft’s official documentation: Data refresh in Power BI - Power BI | Microsoft Learn
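For reference, a Power Automate flow (or any custom automation) ultimately hits the same Refresh Dataset endpoint in the Power BI REST API. A minimal Python sketch of what that call looks like is below; the group ID, dataset ID, and token are placeholders, and each such call counts toward the 48/day quota:

```python
# Hedged sketch: building the REST call that triggers a dataset refresh.
# This is the same endpoint Power Automate's refresh action uses, so every
# call counts toward the 48-per-day Premium limit. GROUP_ID/DATASET_ID/TOKEN
# are placeholders, not real values.

def build_refresh_request(group_id: str, dataset_id: str, token: str):
    """Build the POST request that asks the service to refresh a dataset."""
    url = (
        "https://api.powerbi.com/v1.0/myorg/"
        f"groups/{group_id}/datasets/{dataset_id}/refreshes"
    )
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    # "full" reprocesses the data and recalculates; enhanced refresh also
    # accepts other types such as "dataOnly" or "calculate".
    body = {"type": "full"}
    return url, headers, body

# Usage (would actually consume one of the day's 48 API refreshes):
#   import requests
#   url, headers, body = build_refresh_request(GROUP_ID, DATASET_ID, TOKEN)
#   resp = requests.post(url, headers=headers, json=body)
#   resp.raise_for_status()  # the service answers 202 Accepted on success
```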

Hope that clears things up. If the above solution works for you, please accept it as the solution so that other community members with similar issues can benefit.

Regards,
Akhil.


8 REPLIES
v-agajavelly
Community Support

Hi @mollycat ,

That screenshot says it all. Really appreciate you sharing it! Seeing all those Data Factory refreshes running every couple of minutes and completing successfully is great confirmation that the pipeline approach works differently than the standard refresh mechanism.

I’m with you on the thought that this might be Preview behavior, but if Microsoft decides to keep it this way when it goes GA, it could completely change the game for scenarios where frequent refresh is critical.

One thing I’d suggest (just to be safe) is keeping an eye on capacity utilization in the Fabric Admin metrics, especially if you plan to keep those 2-minute intervals running in production. Even if there’s no hard refresh limit here, resource consumption will still matter.

Thanks again for circling back with the details and proof; this is going to be super helpful for others.

Regards,
Akhil.

v-agajavelly
Community Support

Hi @mollycat ,

No problem at all, happy to keep the conversation going.

On the Direct Query vs Import part, you’ve described exactly the trade-off most folks run into. Direct Query avoids refresh limits, but the report experience can really drag because every click is hitting the source system. Even if your data isn’t huge, the constant back-and-forth kills performance. Import mode almost always gives a snappier user experience, and since your refresh is only ~20 seconds, that’s actually a great sign that Import is the better fit here.

If you do stick with Import, then yes the 48/day refresh limit is the ceiling using the “standard” methods. The Fabric Data Pipelines (Preview) option you found is interesting, because it’s designed for more robust, automated data movement/orchestration. But right now, it doesn’t magically bypass the 48/day cap for dataset refreshes – it still calls the same underlying refresh process. So, from a limits perspective, you’d be in the same place as with Power Automate.

Long-term, if you expect to need more than 48 refreshes, XMLA endpoint is still the only truly “out of the box” way to get around that cap today. It does add a bit of setup and scripting, but it’s reliable once in place.

So, to summarize:

  • Direct Query: avoids refresh limits but will stay slow unless your model is extremely simple and the source DB is lightning fast.
  • Import: better performance, fast refreshes in your case, but subject to the 48/day limit.
  • Data Pipelines (Preview): great for orchestration, but doesn’t bypass the refresh cap.
  • XMLA endpoint: only current way to go past 48/day refreshes.

If performance is the bigger pain point (sounds like it is), I’d lean Import mode + plan around the 48 limit (or XMLA if you truly need more).
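To make the XMLA route concrete: over the XMLA endpoint you refresh the model with a TMSL "refresh" command rather than the REST API. A minimal sketch of building that command is below; the database name is a placeholder for your semantic model's name, and you would execute the resulting JSON from SSMS (XMLA query window) or an automation job:

```python
# Hedged sketch: the TMSL "refresh" command sent over the workspace's XMLA
# endpoint, which bypasses the 48/day REST-refresh quota. "SalesModel" is a
# placeholder database (semantic model) name.
import json

def build_tmsl_refresh(database_name: str, refresh_type: str = "full") -> str:
    """Serialize a TMSL refresh command targeting one semantic model."""
    command = {
        "refresh": {
            "type": refresh_type,  # e.g. "full", "dataOnly", "calculate"
            "objects": [{"database": database_name}],
        }
    }
    return json.dumps(command, indent=2)

# Paste the output into an XMLA query window in SSMS (connected to the
# workspace's XMLA endpoint) and execute it to run the refresh.
print(build_tmsl_refresh("SalesModel"))
```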

Regards,
Akhil.

Hi @v-agajavelly, apologies for the delayed response. Thank you, as always, for your very helpful insight.

 

In my last reply, I hadn't been able to try the Data Pipeline option yet, as it was not showing as available. I have since been able to set up a semantic model refresh and wanted to post here in case anyone has this same question in the future. Using the "create advance refresh" option as shown here: Refresh a Semantic Model Using Data Pipelines (Preview) - Power BI | Microsoft Learn, I was able to set up a refresh that runs every 2 minutes, just to test. It has been running for several hours and I have not yet received a refresh failure. In the semantic model refresh history, this shows as "Data Factory". I do wonder if the seemingly unlimited refresh capacity is only available because this is in Preview mode? But for now, it is allowing me to refresh more than 48 times and is showing as "Completed". I may be missing something here in terms of the 48-refresh limit, but wanted to share this update.

 

Screenshot 2025-08-24 161718.png

v-agajavelly
Community Support

Hi @mollycat ,

Really glad that helped. Let me go through your follow-ups one at a time.

  1. Manual refresh in the App
    Yep, you’ve got it right: the “manual refresh” I was talking about is only something you can do in the workspace itself. Once the report is in an App, that button just isn’t there, even if the person has the right permissions and the gateway set up.
  2. XMLA endpoint
    I’ve used XMLA quite a bit. It’s super flexible and will happily get around the 48/day limit, but it’s a bit more “hands-on” than Power Automate. You’re usually writing TMSL scripts or using SSMS/PowerShell, so it’s more technical. The main things I’ve learned:
  • Make sure you’ve got the right access set up for whoever’s running it.
  • Keep an eye on overlapping refreshes so you don’t accidentally overload the capacity.
  • Treat access carefully; XMLA can do a lot more than just refresh.

If you’re fine with a bit of scripting, it’s a solid option.

  3. What happens if you hit the 48 limit
    When you reach that cap, Power BI will just say “nope” to any more API/flow refreshes that day. The refresh fails, you’ll see it in the refresh history, and the dataset owner (plus anyone else set to get alerts) will get an email saying it failed. The error message is pretty clear that you’ve gone over the limit.
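If you want to spot these failures programmatically rather than wait for the email, the refresh history is also available via the REST API (GET on the same .../refreshes endpoint). A small sketch of filtering that history is below; the field names match the refresh-history schema, but the sample entries are made up:

```python
# Hedged sketch: filtering refresh history (as returned by the GET
# .../refreshes endpoint) for API-triggered refreshes that failed, e.g.
# after the daily cap was hit. Sample values below are illustrative only.

def failed_api_refreshes(history: list) -> list:
    """Return the requestIds of API-triggered refreshes that failed."""
    return [
        entry["requestId"]
        for entry in history
        if entry.get("refreshType") in ("ViaApi", "ViaEnhancedApi")
        and entry.get("status") == "Failed"
    ]

# Hypothetical history entries, shaped like the API's response items.
sample_history = [
    {"requestId": "a1", "refreshType": "ViaApi", "status": "Completed"},
    {"requestId": "b2", "refreshType": "ViaApi", "status": "Failed"},
    {"requestId": "c3", "refreshType": "Scheduled", "status": "Failed"},
]

print(failed_api_refreshes(sample_history))  # ["b2"]
```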

Hope that clears things up, and nice to hear you’re already digging into XMLA; it’s worth the time if refresh frequency is a big deal for you.

Thanks,
Akhil.

Amazing, thank you SO much @v-agajavelly, this is extremely helpful!

Hello @v-agajavelly, I've done some additional research in the time from our last exchange and have a few more questions, if you don't mind 🙂

 

  1. This report was actually set up originally in Direct Query mode to avoid the need for the refresh schedule consideration. When in Import mode, the data refresh takes about 20 seconds, which I'm assuming would be similar to the time to load the data via Direct Query. However, the time it takes to reload the report visual (a Matrix and some filters) is extremely slow, making the report basically unusable. I'm wondering if pivoting back to Direct Query is worthwhile if there is a way to improve the report visual reload performance?
  2. If not, and the best method is to move forward with Import mode (again, the semantic model refresh is very fast, so that is not a limitation), do you know if any of the Fabric features that are currently in preview mode may help solve this? I'm looking at Refresh a Semantic Model Using Data Pipelines (Preview) - Power BI | Microsoft Learn and, while I'm open to the XMLA endpoint, I wonder if something out of the box would be more sustainable long term, assuming, of course, that this would allow 48+ refreshes.

 

Thank you, Akhil!


Thank you @v-agajavelly, this is the information I needed! A few quick follow ups:

1. For the manual refresh (that does not count towards the capacity limit), I understand from what you've explained that this is only available in the Power BI workspace, meaning there is no way for a user to trigger this from within the App, is that correct? This is assuming that they do have access to the gateway, which I believe is the requirement to be able to do a manual refresh.

2. I've found some helpful information about the XMLA endpoint and will definitely explore it as an option. If you have used this method, have you found that there are any major challenges or considerations based on your experience?

3. In the event that a user (or users) triggers the flow and the 48-refresh limit is reached, what happens after that? Does Power BI just reject the refresh? Is an email triggered to the semantic model owner, or is there an error?

 

Again, thank you so much!
