Fabric Data Factory Pipeline: Incremental load
How can I create an incremental load in a Data Factory pipeline?
I only see the table actions Append and Overwrite, but I want to be able to add new rows, update existing ones, and mark deleted rows. I would expect to be able to tell the wizard what the primary key (PK) columns are and have that logic created for me.
Is there a way to do this now? Or is this a feature that is coming later on?
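To make it concrete, something like the following is the logic I would expect the wizard to generate. This is only a sketch in a Fabric notebook (PySpark over a Lakehouse Delta table); the table, column, and key names are placeholders:

```python
# A rough sketch (not actual wizard output) of the upsert/soft-delete logic I mean,
# written against a Lakehouse Delta table. Table, column, and key names are placeholders.
from delta.tables import DeltaTable

target = DeltaTable.forName(spark, "my_lakehouse.dim_customer")   # existing table
source = spark.read.table("my_lakehouse.staging_customer")        # incoming batch

(
    target.alias("t")
    .merge(source.alias("s"), "t.CustomerID = s.CustomerID")      # PK column(s)
    .whenMatchedUpdateAll()                                        # update existing rows
    .whenNotMatchedInsertAll()                                     # add new rows
    # Only meaningful if the batch is a full extract; also requires a Delta version
    # that supports whenNotMatchedBySource clauses.
    .whenNotMatchedBySourceUpdate(set={"IsDeleted": "true"})       # mark deleted rows
    .execute()
)
```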
Many thanks!
ITsmart BI and Analytics consultant
Your table action options imply that you are using pipelines. If so, good news: we have just added a new feature that allows you to upsert new rows into existing tables instead of append/overwrite only.
It's in private preview; please ping me with your workspace ID, and we can enable it for you along with instructions. cc @makromer
Hi,
Is it already known when this feature will be generally available? I also need it.
Thanks!
Hello, is it possible to have this enabled for our capacity? It would help us a lot, as we are planning to move some of our enterprise data from SQL DB to Fabric.
Hi, is it still possible to try this feature? I have set up incremental refresh with stored procedures, etc., but I want that setup for the first step, from on-premises SQL to the lakehouse, and I would rather not recreate everything or fall back to full refreshes for the prepared data going from the lakehouse to the warehouse 🙂
Hi @ajarora,
Do you know when this feature will be generally available?
I would like to do upserts with the Copy activity or a dataflow instead of a stored procedure.
Thanks.
Hi @ajarora,
When will this action be available outside the private preview?
Thanks
Hi @ajarora,
Thanks for providing the information! I have a question about using Data Factory pipelines to upsert data incrementally from an on-premises SQL Server. Is there a method available for doing this?
I've managed to establish a connection to the server through our gateway and would like to use the Data Factory Copy activity to query data and upsert it into a Lakehouse Delta table. However, I'm not seeing the created connection in the Copy activity wizard. Is there something I might be missing?
On the other hand, I can connect to it using Dataflows Gen2, apply a filter, and then use the Lakehouse as the destination in append mode. The problem is, I'm not quite sure how to query only the new records without loading the entire table initially.
Do you know if there's a recommended method to achieve this? Especially now during the preview period of Fabric, I'd appreciate any guidance you might have.
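What I have in mind is the usual high-watermark pattern; roughly, in notebook terms (PySpark, with placeholder table and column names, as a sketch of the idea rather than a recommended method):

```python
# High-watermark sketch: only pull rows that changed since the last load.
# Assumes the destination table keeps a ModifiedDate column; names are placeholders.

# 1) Read the watermark from what is already loaded in the Lakehouse.
last_watermark = (
    spark.read.table("my_lakehouse.sales_orders")
    .selectExpr("max(ModifiedDate) AS wm")
    .first()["wm"]
)

# 2) Use it to build the source query for the next pull. In Dataflows Gen2 this
#    would be a filter step; in a Copy activity it would be the source query.
source_query = (
    "SELECT * FROM dbo.SalesOrders "
    f"WHERE ModifiedDate > '{last_watermark}'"
)
print(source_query)
```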
Thanks in advance for your help!
Unfortunately, gateway-based connections are not currently supported in pipelines. We are actively working on it. Until then, a workaround is to use dataflows to stage your on-premises data in a cloud store (say, a SQL database) and then use Copy to upsert it into the Lakehouse table.
Hi
I've PM'ed you our workspace ID. Would you be able to enable the preview feature? We're evaluating different options at the moment, and incremental load is one of the requirements.
Thanks!
Hey!
At the moment, we don't have an incremental refresh feature in the way that Dataflows Gen1 has in Power BI. We are tracking this internally, and I've also created an idea so you can vote for it and help us prioritize it:
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=4814b098-efff-ed11-a81c-6045bdb98602
There are some alternatives, depending on the logic your incremental refresh should follow, that leverage the existing capabilities: use pipelines for the orchestration component together with output destinations in Append mode, or load to a staging table and then run whatever scripts you need in a notebook to perform the incremental refresh.
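For example, here is a minimal sketch of the staging-table variant in a notebook (Spark SQL run from PySpark); the table and key names are placeholders, not an official pattern:

```python
# Assumes the pipeline has already appended the latest batch into a staging table.
# Merge it into the final table on the key column, then clear the staging table.
spark.sql("""
    MERGE INTO my_lakehouse.orders AS t
    USING my_lakehouse.orders_staging AS s
      ON t.OrderID = s.OrderID
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")

# Empty the staging table so the next pipeline run starts from a clean batch.
spark.sql("DELETE FROM my_lakehouse.orders_staging")
```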