Hi all,
In Fabric, I set up the following medallion structure for a table whose data is only updated once in a while.
Upload to the delta lake (bronze) --> a notebook transforms and moves the data to the delta lake (silver) --> move the data to the data warehouse (gold) via SQL.
My initial thought was to create a data pipeline in Fabric, but this can only be scheduled on a time trigger. Scheduling it daily (to make sure that when changes occur, they propagate across all layers) is of course overkill. There is also no such thing as Azure Event Grid in Fabric. What is best practice here? I can imagine there are a lot of datasets which do not require a frequent update, but do need to be updated once the data in bronze changes.
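For reference, the silver step is just a small notebook; a minimal sketch of the kind of transform involved (lakehouse and table names are illustrative):

```python
# Minimal sketch of the bronze -> silver notebook step; the lakehouse
# and table names are illustrative. `spark` is predefined in a Fabric
# notebook session.
df = spark.read.table("bronze_lakehouse.my_table")

# Example transform: deduplicate on the key column and drop rows
# that are missing it.
df_clean = df.dropDuplicates(["id"]).na.drop(subset=["id"])

# Overwrite the silver Delta table with the cleaned data.
df_clean.write.mode("overwrite").format("delta") \
    .saveAsTable("silver_lakehouse.my_table")
```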
Kind regards,
Bas
Hi @daircom,
MS Fabric does have some more advanced triggers for data pipelines, but not in the scheduling feature. Open the data pipeline and select 'Add trigger (preview)'; this offers more options for triggering your pipeline. I am not sure whether your specific use case is covered by the options, but you could check it out:
https://learn.microsoft.com/en-us/fabric/data-factory/pipeline-storage-event-triggers
Hi @daircom
How is the new data uploaded to the bronze layer? In what form does it exist there?
My idea is that you could add a pre-check step that verifies whether the data in the bronze layer has been updated. Depending on how the bronze layer is built, this could be a Lookup activity, a Get Metadata activity, or something else (e.g. a Notebook activity).
Then add an If Condition activity to evaluate the result of the pre-check step. If the data in the bronze layer has been updated, execute a child pipeline or follow-up activities to refresh the data in the silver and gold layers; if there is no update, do not execute anything further. See the sketch after this paragraph for a notebook-based pre-check.
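For the notebook variant, a minimal sketch of the pre-check could compare the bronze Delta table's latest commit timestamp against a stored watermark; the table name and watermark value here are illustrative:

```python
from datetime import datetime
from delta.tables import DeltaTable
from notebookutils import mssparkutils  # built into Fabric notebooks

# Watermark of the last successful refresh; in practice, read this from
# a control table or pipeline parameter instead of hard-coding it.
last_processed = datetime(2024, 1, 1)

# Latest commit timestamp on the bronze Delta table (illustrative name).
bronze = DeltaTable.forName(spark, "bronze_lakehouse.my_table")
latest_commit = bronze.history(1).select("timestamp").collect()[0][0]

# Exit with a value the pipeline's If Condition can evaluate, e.g.
# @equals(activity('PreCheck').output.result.exitValue, 'updated')
# (the exact output path may differ depending on your pipeline setup).
result = "updated" if latest_commit > last_processed else "unchanged"
mssparkutils.notebook.exit(result)
```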
You can then schedule the parent data pipeline daily. It will run the pre-check step first to decide whether to refresh the data or not.
Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!
What mechanism are you using to bring data into your bronze layer? Are you manually uploading it?
In that case, you could also trigger the pipeline manually after you have uploaded the new file. Semantic models can be refreshed by a pipeline, or by a notebook using semantic-link, which provides advanced refresh settings.
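As a rough sketch of the semantic-link route (the model and workspace names are placeholders):

```python
# Minimal sketch of refreshing a semantic model from a Fabric notebook
# with semantic-link (sempy); the dataset and workspace names are
# placeholders.
import sempy.fabric as fabric

# Kick off a refresh; refresh_dataset also exposes options such as
# refresh_type and objects for finer-grained refreshes.
fabric.refresh_dataset(dataset="My Semantic Model", workspace="My Workspace")
```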