
silcambro
Frequent Visitor

Refresh on Dataflows using your own Azure datalake

Hello,
 
I have a very simple question. If I set up a dataflow using my own data lake, do I need to schedule the refresh?
 
Just to be more specific: if I create a dataflow with the following option:
 
Attach a Common Data Model folder (preview)
 
is there a need to refresh the dataflow? Or do I just change the CDM folder, and the dataflow will reflect the changes?
 
Thank you so much

2 REPLIES
Anonymous
Not applicable

@silcambro 
Yes, you need to set up a scheduled refresh. The scheduled refresh picks up changes in the data sources that are added to the dataflow. Attaching a Common Data Model folder as a dataflow only maps the entities into the dataflow; you still need to refresh the data.
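Besides the scheduled refresh configured in the service, a refresh can also be triggered on demand through the Power BI REST API (Dataflows - Refresh Dataflow). Below is a minimal sketch of building that request; the group ID, dataflow ID, and token are placeholders, not real values.

```python
import json

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_refresh_request(group_id: str, dataflow_id: str, token: str):
    """Build the URL, headers, and body for a dataflow refresh POST."""
    url = f"{API_ROOT}/groups/{group_id}/dataflows/{dataflow_id}/refreshes"
    headers = {
        "Authorization": f"Bearer {token}",   # Azure AD access token
        "Content-Type": "application/json",
    }
    # notifyOption controls whether Power BI emails you about the refresh outcome
    body = json.dumps({"notifyOption": "MailOnFailure"})
    return url, headers, body

# Placeholder IDs for illustration only; send the request with e.g.
# requests.post(url, headers=headers, data=body)
url, headers, body = build_refresh_request("my-group-id", "my-dataflow-id", "my-aad-token")
print(url)
```

The actual POST requires an Azure AD token with the Dataflow.ReadWrite.All scope; this sketch only assembles the request so the pieces are visible.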


Paul Zheng _ Community Support Team
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Thank you for your reply!

 

I will add files to my CDM folders and change the data through Azure Data Factory.

 

Why do I need to refresh? Will the refresh copy my CDM folder's data to the Azure Data Lake managed by Microsoft?

 

Thank you so much
