Hi everyone,
I'm working with Dynamics 365 Finance & Operations and using Synapse Link to export data to an ADLS data lake in Delta Lake format. In Microsoft Fabric, I use shortcuts to reference these external Delta tables and I'm planning to implement a Medallion architecture using Lakehouses, notebooks and pipelines.
I need to refresh the data 5 times a day, and I’d like to always keep the latest data from D365 in my Bronze layer. However, I’m facing challenges related to incremental loading and resource usage optimization:
CDF (Change Data Feed) doesn't seem to be enabled or supported on these external Delta tables stored in the ADLS data lake, and I can't change that since the export is handled by Synapse. Without it, I'm not sure how to move only the new and modified records into the Bronze layer tables.
I don't want to overwrite the Bronze table on each refresh (which would be simple, but isn't best practice); I'd prefer to incrementally merge changes (including upserts/deletes) without full reloads, while also keeping a record of the changes.
I also considered using Delta Lake time travel (e.g., comparing previous and current versions), but certain versions are no longer available in the data lake when I try to access them.
To summarize what I need:
Keep only the most up-to-date data in my Bronze layer (not just appends, but updates/deletes too)
Avoid full snapshot copies in Fabric (due to cost and inefficiency)
Store changes in a separate table to retain deltas if needed later
My question is:
What’s the best practice for implementing incremental updates from external Delta tables into Fabric Lakehouse, especially for D365 via Synapse Link?
Thanks in advance for any insights or experience you can share!
Hi @PDG_VL,

To implement incremental updates from external Delta tables into a Fabric Lakehouse Bronze layer, use a timestamp column to filter only the rows that are new or modified since the last refresh. Perform the upserts with Delta Lake's MERGE to handle updates and deletes efficiently, and maintain a separate change-log table for historical tracking, appending only the new records on each refresh. To reduce resource usage, partition the Delta tables and process only the relevant partitions. A rough sketch of this pattern is shown below.
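Here is a minimal sketch of the watermark + MERGE + change-log approach, assuming a Fabric notebook (PySpark) where `spark` is already available. The table names, the `RECID` key, the `modifiedDateTime` watermark column, and the `_watermarks` control table are hypothetical placeholders; adjust them to whatever columns your Synapse Link export actually provides.

```python
from pyspark.sql import functions as F
from delta.tables import DeltaTable

# Hypothetical names -- replace with your own shortcut path, key and watermark columns.
SOURCE_PATH = "Tables/custtrans"            # shortcut to the external Delta table in ADLS
TARGET_TABLE = "bronze_custtrans"           # Bronze Delta table in the Fabric Lakehouse
CHANGELOG_TABLE = "bronze_custtrans_changes"
WATERMARK_TABLE = "_watermarks"             # small control table tracking the last processed timestamp
KEY_COLS = ["RECID"]                        # primary key column(s)
WATERMARK_COL = "modifiedDateTime"          # assumes the export carries a modified timestamp

# 1. Read the last processed watermark (None on the first run).
try:
    last_wm = (spark.table(WATERMARK_TABLE)
               .filter(F.col("table_name") == TARGET_TABLE)
               .agg(F.max("watermark")).collect()[0][0])
except Exception:
    last_wm = None

# 2. Read only rows changed since the last refresh.
source_df = spark.read.format("delta").load(SOURCE_PATH)
if last_wm is not None:
    source_df = source_df.filter(F.col(WATERMARK_COL) > F.lit(last_wm))

# 3. MERGE the changed rows into the Bronze table (upsert).
if not spark.catalog.tableExists(TARGET_TABLE):
    source_df.write.format("delta").saveAsTable(TARGET_TABLE)
else:
    target = DeltaTable.forName(spark, TARGET_TABLE)
    cond = " AND ".join(f"t.{c} = s.{c}" for c in KEY_COLS)
    (target.alias("t")
        .merge(source_df.alias("s"), cond)
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())

# 4. Append the same slice to a change-log table so the deltas are retained.
(source_df
    .withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta").mode("append").saveAsTable(CHANGELOG_TABLE))

# 5. Advance the watermark for the next run.
new_wm = source_df.agg(F.max(WATERMARK_COL)).collect()[0][0]
if new_wm is not None:
    (spark.createDataFrame([(TARGET_TABLE, new_wm)], ["table_name", "watermark"])
         .write.format("delta").mode("append").saveAsTable(WATERMARK_TABLE))
```

One caveat: a watermark filter alone will not catch hard deletes on the source side. If the Synapse Link export includes a soft-delete flag (for example an `IsDelete` column), you can route those rows through `whenMatchedDelete` in the MERGE; otherwise you would need a periodic key comparison (anti-join) against the full source to detect removed records.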