lemaribdb
Helper II

Orchestrating dataflow & semantic model refresh, warehouse/lakehouse

Hello guys, quick question.

Currently I have a few dataflows (one pulls FACT tables, one DIM tables, one some custom local data; I have separated the concerns across a few dataflows). Then I have my semantic model that pulls from all the dataflows, creates relationships, and adds measures. This is my golden model that all my reports pull from.
I want to take this a level higher. Having to guesstimate how long the dataflow refreshes take and then schedule the semantic model refresh accordingly is a bit silly.

I know the next step is probably using Fabric objects.
Can someone advise which object would help with the above case? Either to orchestrate the refreshes (I cannot use Power Automate), or to land all this data in one warehouse/lakehouse that refreshes as a unit? Are there any pros/cons of "complicating" my setup with Fabric objects?

JB81
Helper II

Hi

 

This really depends (ah, another "it depends"!) on several factors: budget, time, and experience.

 

My take is that Dataflows are fine (afaik these are M code data transforms ported out of Power BI and into Fabric?) if you have limited data modelling skills in upstream languages, namely SQL.

 

Are you asking about architecture for a data warehouse, or just how to orchestrate dataflow and dataset* refreshes?

 

Regards

JB

 

* Yes, I know they are now called Semantic Models, but you'd have thought they'd have been called Semantic Models first and then moved to the marketing name of Datasets. Although other technologies also use the term "semantic model" to capture the relationships between schema and process, so perhaps it makes sense.

 

Hi, thanks for the reply,

The dataflows mainly pull from an Oracle DB, some via SQL, some via the PBI Service GUI/M code.
The dataflows themselves are fine, it's true, but I cannot orchestrate a proper refresh chain with them.

My main question is how to orchestrate the refresh of it all (when the dataflows complete, refresh the semantic model). I jumped straight to Lakehouse/Warehouse/Gen2 Dataflows because, afaik, that's the only way to do the refresh orchestration properly (there is also Power Automate, which I cannot use).
So I believe the best option may be converting the Dataflows to Gen2 Dataflows, creating a Lakehouse that gets its data from those, and creating a custom semantic model from that Lakehouse. I would do everything I normally do in the semantic model there, so it would basically be the same solution I have now, but it would let me refresh everything as one chain?
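For what it's worth, the "refresh dataflows, then refresh the model" chain can also be driven from outside by any scheduler through the Power BI REST API (a Fabric Data Pipeline with dataflow and semantic model refresh activities gives you the same chain with no code). Below is a minimal sketch, not a definitive implementation: the workspace/dataflow/dataset IDs and the token are placeholders you would supply, and the transaction status strings ("InProgress"/"Success"/"Failed") should be checked against the current API docs.

```python
# Sketch: start each dataflow refresh, poll until it finishes, then refresh
# the semantic model. IDs and token acquisition are assumptions/placeholders.
import json
import time
import urllib.request
from typing import Callable, Optional

API = "https://api.powerbi.com/v1.0/myorg"


def wait_until_done(get_status: Callable[[], str],
                    poll_seconds: float = 30.0,
                    timeout_seconds: float = 3600.0) -> str:
    """Poll get_status() until it returns a terminal refresh state."""
    waited = 0.0
    while True:
        status = get_status()
        if status in ("Success", "Failed", "Cancelled"):
            return status
        if waited >= timeout_seconds:
            raise TimeoutError("refresh did not finish in time")
        time.sleep(poll_seconds)
        waited += poll_seconds


def _call(token: str, url: str, body: Optional[dict] = None) -> dict:
    """Minimal authenticated GET (body=None) / POST helper."""
    req = urllib.request.Request(
        url,
        data=None if body is None else json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="GET" if body is None else "POST")
    with urllib.request.urlopen(req) as resp:
        raw = resp.read()
        return json.loads(raw) if raw else {}


def refresh_dataflow(token: str, workspace: str, dataflow: str) -> None:
    # POST .../dataflows/{id}/refreshes starts the refresh asynchronously.
    _call(token, f"{API}/groups/{workspace}/dataflows/{dataflow}/refreshes",
          {"notifyOption": "NoNotification"})


def dataflow_status(token: str, workspace: str, dataflow: str) -> str:
    # The most recent transaction reflects the refresh just started.
    txs = _call(token, f"{API}/groups/{workspace}/dataflows/{dataflow}/transactions")
    return txs["value"][0]["status"]


def refresh_dataset(token: str, workspace: str, dataset: str) -> None:
    _call(token, f"{API}/groups/{workspace}/datasets/{dataset}/refreshes",
          {"notifyOption": "NoNotification"})


def orchestrate(token: str, workspace: str,
                dataflows: list, dataset: str) -> None:
    """Refresh all dataflows in sequence, then the golden semantic model."""
    for df in dataflows:
        refresh_dataflow(token, workspace, df)
        result = wait_until_done(lambda: dataflow_status(token, workspace, df))
        if result != "Success":
            raise RuntimeError(f"dataflow {df} refresh ended as {result}")
    refresh_dataset(token, workspace, dataset)  # golden model last
```

The polling helper is deliberately separate from the HTTP calls, so the "wait for terminal state" logic can be reused (or tested) without hitting the service.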

 
