daircom
Resolver II

Fabric medallion: how to refresh tables that are only occasionally updated?

Hi all,

In Fabric, I set up the following medallion structure for a table whose data is updated only once in a while.

Upload to Delta Lake (bronze) --> notebook transforms and moves data to Delta Lake (silver) --> move data to the data warehouse (gold) via SQL.
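For illustration, the bronze-to-silver notebook step might look like the sketch below. This is a minimal pandas stand-in with hypothetical transformations (column-name cleanup and de-duplication); in Fabric this step would typically be a PySpark notebook reading and writing Delta tables.

```python
import pandas as pd

def bronze_to_silver(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical bronze -> silver transform: clean column names and
    drop exact duplicate rows. In a real Fabric notebook this would read
    from the bronze Delta table and write the result to the silver one."""
    out = df.copy()
    # Normalize column names: trim, lowercase, snake_case.
    out.columns = [c.strip().lower().replace(" ", "_") for c in out.columns]
    # Remove exact duplicates introduced by repeated uploads.
    out = out.drop_duplicates()
    return out

# Local demo with made-up data:
raw = pd.DataFrame({" Customer ID ": [1, 1, 2], "Amount": [10, 10, 20]})
clean = bronze_to_silver(raw)
```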

My initial thought was to create a data pipeline in Fabric, but it can only be scheduled on a time trigger. Scheduling it daily (to make sure that when changes occur, they propagate across all layers) is of course overkill. There is also no equivalent of Azure Event Grid. What is best practice in Fabric? I can imagine there are a lot of datasets that do not require a frequent update, but do need to be refreshed once the data in bronze changes.

 

Kind regards,

Bas 

1 ACCEPTED SOLUTION
FabianSchut
Super User

Hi @daircom,

 

MS Fabric does have some more advanced triggers for data pipelines, but not in the scheduling feature. Open the data pipeline and select 'Add trigger (preview)'. This offers more options for triggering your data pipeline. I am not sure whether your specific use case is included in the options, but you could check it out:
https://learn.microsoft.com/en-us/fabric/data-factory/pipeline-storage-event-triggers


3 REPLIES
Anonymous
Not applicable

Hi @daircom 

 

How is the new data uploaded to the Bronze layer? In what form does it exist in the bronze layer? 

 

My idea is that you could add a pre-check step to determine whether the data in the bronze layer has been updated. It might be a Lookup activity, a Get Metadata activity, or something else (e.g. a Notebook activity), depending on how the bronze layer is built.

 

Then add an If Condition activity to evaluate the result of the pre-check step. If the data in the bronze layer has been updated, execute a child pipeline or follow-up activities to refresh the data in the silver and gold layers. If there is no update, do not execute any further activity.

 

You can then schedule the parent data pipeline daily. It will run the pre-check step first to decide whether to refresh the data or not.
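The pre-check idea above could be sketched as follows, assuming the bronze layer is a folder of files: compare the latest file-modification time against a stored watermark, and only signal "refresh needed" when something newer has appeared. In a Fabric pipeline this logic could live in a Notebook activity whose result feeds the If Condition activity; all names and paths here are illustrative.

```python
import json
from pathlib import Path

def bronze_updated(bronze_dir: Path, watermark_file: Path) -> bool:
    """Return True if any file under bronze_dir is newer than the stored
    watermark, and advance the watermark. Illustrative only: in Fabric the
    check might instead query Delta table history or row counts."""
    latest = max(
        (f.stat().st_mtime for f in bronze_dir.rglob("*") if f.is_file()),
        default=0.0,
    )
    previous = 0.0
    if watermark_file.exists():
        previous = json.loads(watermark_file.read_text())["last_seen_mtime"]
    if latest > previous:
        # Record the new high-water mark so the next run sees no change.
        watermark_file.write_text(json.dumps({"last_seen_mtime": latest}))
        return True
    return False
```

The parent pipeline would run this daily and only invoke the silver/gold refresh when the function returns True.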

 

Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!

frithjof_v
Super User

What mechanism are you using to bring data into your bronze layer? Are you manually uploading it?

 

In that case, you could also trigger a pipeline manually after you have uploaded the new file. Semantic models can be refreshed by a pipeline, or by a notebook using semantic-link, which provides advanced refresh settings.

