russellj
Frequent Visitor

How to refresh all data sources for report in one shot?

Hello,

 

Given the following setup for a non-scheduled refresh report:

 

[screenshot: russellj_0-1709913590394.png — lineage view showing a Dataflow from another workspace feeding a Dataflow in this workspace, which feeds the semantic model]

 

I end up having to refresh these one at a time: first the Dataflow at the top (from another workspace), then the Dataflow below it (part of this workspace), then the semantic model, before the report shows new data. I have to wait for each one to complete before starting the next, which makes this very tedious. Why doesn't refreshing just the semantic model (highlighted in red) refresh the report? Shouldn't refreshing it refresh all of the data sources within that semantic model?

1 ACCEPTED SOLUTION

Hi @russellj @kblackburn 

Refreshing the semantic model does not trigger a refresh of the Dataflows, so the semantic model reloads whatever data the Dataflows last produced. That is why you must refresh the Dataflows first and, only after all of those refreshes have completed, trigger the refresh of the semantic model to get up-to-date data.

Regards
Amine Jerbi

If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook

View solution in original post

7 REPLIES
russellj
Frequent Visitor

Some good discussion here. It looks like this is basically as designed, and my only option would be to schedule the refreshes individually so they run in the correct order. For a manual refresh, running the Dataflows one at a time followed by the semantic model will have to do. The problem is that these are very large datasets, so it is annoying to have to monitor each one for completion before manually starting the next. Oh well.

 

Thanks all for the replies.
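For anyone wanting to script the manual sequence above, the Power BI REST API exposes dataflow refresh (`POST .../dataflows/{id}/refreshes`), dataflow transaction history (`GET .../dataflows/{id}/transactions`), and dataset refresh (`POST .../datasets/{id}/refreshes`). The sketch below chains them: refresh each Dataflow, poll until it finishes, then kick off the semantic model. It is a minimal illustration, not a hardened implementation — it assumes you already have an Azure AD access token and the workspace/artifact GUIDs, and that the transactions endpoint returns the most recent refresh first.

```python
# Hedged sketch (stdlib only): chain Power BI refreshes via the REST API.
# The token and IDs are placeholders you must supply yourself.
import json
import time
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def _call(token, method, path, body=None):
    """Minimal REST helper around urllib."""
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(
        f"{API}/{path}", data=data, method=method,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        raw = resp.read()
        return json.loads(raw) if raw else {}

def latest_status(transactions):
    """Status of the most recent refresh from a transactions payload."""
    items = transactions.get("value", [])
    return items[0]["status"] if items else "Unknown"

def refresh_dataflow_and_wait(token, group_id, dataflow_id, poll_seconds=30):
    # Kick off the dataflow refresh, then poll its transaction history.
    _call(token, "POST", f"groups/{group_id}/dataflows/{dataflow_id}/refreshes",
          {"notifyOption": "NoNotification"})
    while True:
        status = latest_status(
            _call(token, "GET",
                  f"groups/{group_id}/dataflows/{dataflow_id}/transactions"))
        if status not in ("InProgress", "Unknown"):
            return status
        time.sleep(poll_seconds)

def refresh_chain(token, group_id, dataflow_ids, dataset_id):
    """Refresh each Dataflow in order, then the semantic model (dataset)."""
    for df in dataflow_ids:
        status = refresh_dataflow_and_wait(token, group_id, df)
        if status != "Success":
            raise RuntimeError(f"Dataflow {df} refresh ended with {status}")
    _call(token, "POST", f"groups/{group_id}/datasets/{dataset_id}/refreshes")
```

In practice you would run `refresh_chain(token, group_id, ["<upstream-df-id>", "<downstream-df-id>"], "<dataset-id>")` with real GUIDs; note the upstream Dataflow lives in another workspace, so that call may need a different `group_id` than shown here.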

lbendlin
Super User

This will become very complex very quickly, especially if you have multiple chained dependencies. You need to think of the problem from the other side: when the primary data source refreshes, that should cascade down the chain.

 

If this is important to you please consider voting for an existing idea or raising a new one at https://ideas.fabric.microsoft.com/?forum=2d80fd4a-16cb-4189-896b-e0dac5e08b41

kblackburn
Advocate I

The same issue comes up for on-demand refreshes with non-power users, so I am setting up a Power Automate flow that runs when the first dataflow completes. Maybe the same would work for you?


Correct, that is what I am doing in Power Automate.

You can use scheduled refresh in Power BI instead of Power Automate; you just need to choose suitable schedules for the Dataflows and then for the semantic model.

Regards
Amine Jerbi


The OP indicated that this was for a non-scheduled refresh situation. For scheduled refreshes, yes, we are using Power BI. Thanks.
