Kazanskyi
Helper I

How to update all Fact and DIM tables in Fabric at the same moment

I am working on moving our current Power BI dataset to Fabric.

The data source is an on-premises SQL data warehouse that refreshes multiple times per day; these refreshes usually finish during working hours.

 

I am copying tables from the warehouse to Fabric using a pipeline, and the fact tables and DIM tables take different amounts of time to load.

As far as I understand, Power BI reads from the latest version of the table in Fabric, and data is available immediately after the table is updated.

This means that users may get inconsistent data at some point during the day, because the DIM and FACT tables are not updated synchronously.

 

How can I achieve the same all-at-once delivery that a Power BI dataset refresh provides?

 

 

1 ACCEPTED SOLUTION
tackytechtom
Super User

Hi @Kazanskyi ,

 

I happen to have written a blog article about this exact scenario. Feel free to check it out:

https://www.tackytech.blog/how-to-refresh-your-direct-lake-semantic-model-from-fabric-data-pipelines...

 

In this blog post, we disable the setting that keeps the Direct Lake data up to date. Then we trigger a "refresh" (in this case more accurately called a reframing) from a Fabric data pipeline. There is even a new dedicated pipeline activity for exactly this purpose.
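For reference, the same reframe can also be triggered programmatically with the Power BI REST API's dataset-refresh endpoint, e.g. from a notebook or a Web activity at the end of the pipeline, after all copy activities have finished. Here is a minimal sketch; the workspace ID, dataset ID, and token acquisition are placeholders you would supply yourself:

```python
# Sketch: trigger a Direct Lake semantic model reframe via the
# Power BI REST API, once all fact/DIM copies have completed.
# (IDs and the bearer token below are assumptions/placeholders.)
import json
import urllib.request

POWERBI_API = "https://api.powerbi.com/v1.0/myorg"

def build_refresh_request(workspace_id: str, dataset_id: str, token: str) -> urllib.request.Request:
    """Construct the POST that asks Power BI to refresh (reframe) the model."""
    url = f"{POWERBI_API}/groups/{workspace_id}/datasets/{dataset_id}/refreshes"
    body = json.dumps({"type": "full"}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Usage (actually fires the refresh; requires a valid Entra ID token):
# urllib.request.urlopen(build_refresh_request(ws_id, ds_id, token))
```

Because the reframe runs only after the whole pipeline succeeds, users see the old consistent snapshot until both FACT and DIM tables have landed, then switch to the new one in a single step.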

 

Let me know if this helps you or in case you have questions along the way 🙂

 

/Tom
https://www.tackytech.blog/
https://www.instagram.com/tackytechtom/



Did I answer your question➡️ Please, mark my post as a solution ✔️

Also happily accepting Kudos 🙂

Feel free to connect with me on LinkedIn!

#proudtobeasuperuser 


