Hi!
I have a client implementing Fabric on the lowest capacity possible due to budget constraints. To keep costs under control, we are trying to find the best practice or best way to bring data from a Lakehouse into the Warehouse that minimizes CU usage, can be scheduled, supports incremental refresh, and uses SQL to get the data.
We've tried a Copy activity with SQL queries inside it, which meets most of the requirements, but CU usage still seems high.
Any recommendations? We are thinking about trying stored procedures scheduled through a pipeline. Any other ideas?
We have not considered mirroring because most of the tables have no date/time column from the source systems.
@JulianaMacedo If your goal is to load data into a Warehouse from a Lakehouse incrementally and create a separate materialized dataset, as opposed to what @KevinChant suggested (a dynamic view reading data directly from the Lakehouse), then you need a SQL MERGE query. That is, to the best of my knowledge, the most efficient way to do an incremental load into a Warehouse. The easiest way to run such a query is a Data Factory pipeline with a Script activity, with the SQL coded right in the activity. A slightly more involved way is to create a stored procedure in the Warehouse and use a Stored Procedure activity instead. You can trigger such a pipeline with a OneLake data event trigger that fires when any table in the Lakehouse is modified, which is detected by a new file being created in that table's delta log folder.
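For illustration, a minimal MERGE of that shape might look like the following. In the same workspace, the Warehouse can reference Lakehouse tables by three-part name; the Lakehouse name, table names, and columns below are all hypothetical placeholders, so adjust them to your schema:

```sql
-- Hypothetical incremental load: upsert changed rows from a Lakehouse table
-- into a Warehouse table. MyLakehouse, dbo.SalesStaging, dbo.Sales, and the
-- column names are placeholders for your own objects.
MERGE dbo.Sales AS tgt
USING MyLakehouse.dbo.SalesStaging AS src
    ON tgt.SaleId = src.SaleId
WHEN MATCHED AND (tgt.Amount <> src.Amount OR tgt.Status <> src.Status)
    THEN UPDATE SET
        tgt.Amount = src.Amount,
        tgt.Status = src.Status
WHEN NOT MATCHED BY TARGET
    THEN INSERT (SaleId, Amount, Status)
    VALUES (src.SaleId, src.Amount, src.Status);
```

The same statement can be wrapped in a Warehouse stored procedure and called from a Stored Procedure activity if you prefer that over a Script activity.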
Hope that helps.
Hi @sam51,
We would like to confirm if our community members answer resolves your query or if you need further help. If you still have any questions or need more support, please feel free to let us know. We are happy to help you.
Thank you for your patience and look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support
Hi yes it has helped.
Are they in the same workspace? If so, instead of copying the data over and duplicating it, create a view in the Warehouse that points to the Lakehouse table.
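A minimal sketch of such a view, assuming the Warehouse and Lakehouse share a workspace so the Lakehouse table can be referenced by a three-part name (all names here are hypothetical):

```sql
-- Hypothetical: a Warehouse view reading a Lakehouse table directly.
-- No data is copied; queries always see the current Lakehouse data,
-- so no schedule or incremental refresh is needed.
CREATE VIEW dbo.vw_Sales
AS
SELECT SaleId, Amount, Status
FROM MyLakehouse.dbo.Sales;
```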
Yes they are in the same workspace.
But is it possible to schedule and do an incremental load with a view? My knowledge there is probably limited, but how would one do that with a view?