mds123
Frequent Visitor

Saving a daily copy of inventory table to lakehouse

I'm brand new to Fabric and the Data Factory tools. I have an inventory table that shows the current inventory for all my products. I'm looking to build a table that holds a daily snapshot of the inventory. I'm assuming I could do this with incremental refresh, but how do I make it so I don't delete historic information?

 

I'm brand new to all the Fabric / Data Factory tools, so any help would be amazing.

1 ACCEPTED SOLUTION
miguel
Community Admin

It doesn't seem like you'd need incremental refresh.

 

You can just run the dataflow every day, add a column with the timestamp, and append the result of each day's execution to a "snapshots" table. Again, it doesn't seem like you need incremental refresh at all; just doing an Append operation would suffice.
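For what it's worth, here is a minimal Power Query M sketch of that approach, assuming your existing inventory query is named Source (the column name is just an example):

// Add a snapshot timestamp so each day's appended rows can be told apart later.
let
    // "Source" stands in for your existing inventory query
    WithSnapshotTime = Table.AddColumn(
        Source,
        "SnapshotTimestamp",         // example column name
        each DateTimeZone.UtcNow(),  // current UTC time when the dataflow refreshes
        type datetimezone
    )
in
    WithSnapshotTime

With the dataflow's data destination pointed at a "snapshots" table and set to append, each refresh then just adds that day's rows on top of the previous ones.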


6 REPLIES

miguel
Community Admin

You don't really need incremental refresh for this. If your goal is to take a snapshot, just run the Dataflow Gen2 every day and append the data to a "snapshots" table. Each run would be the snapshot for a particular date. You can also add another column that records a timestamp for when the process was executed.

 

Again, it doesn't appear that you need incremental refresh, but that feature will come soon as well.

frithjof_v
Super User

If you don't want to delete historic data, then you can use the Append mode. In this way, you just add new rows, without deleting old rows. I would add a snapshot timestamp column so you can keep track of which rows belong to which snapshot.
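Purely as an illustration of that point, a hedged M sketch of pulling one day's snapshot back out of the appended table, assuming the table and column are named snapshots and SnapshotTimestamp:

// Keep only the rows that belong to a single snapshot date.
let
    // "snapshots" and "SnapshotTimestamp" are assumed names, not from the thread
    OneDay = Table.SelectRows(
        snapshots,
        each Date.From([SnapshotTimestamp]) = #date(2025, 6, 1)
    )
in
    OneDay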

Anonymous
Not applicable

Hi @mds123 ,

Thanks for using Fabric Community.
If you're looking for incremental load, you can refer to these documents (a rough sketch of the idea follows the links):
Incremental Load in Fabric
Pattern to incrementally amass data with Dataflow Gen2 - Microsoft Fabric | Microsoft Learn
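Roughly speaking, the pattern in that second link comes down to checking what is already in the destination and only appending source rows that are newer. A hedged M sketch of the general idea (SnapshotsTable, InventorySource, and ModifiedDate are placeholder names, not the article's exact steps):

// Find the highest watermark already loaded, then keep only newer source rows to append.
let
    // All names below are placeholders for your own queries and columns
    LastLoaded = List.Max(SnapshotsTable[ModifiedDate]),
    NewRows = Table.SelectRows(
        InventorySource,
        each [ModifiedDate] > LastLoaded
    )
in
    NewRows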

FYI: Incremental refresh will be provided in a future release.



To learn about future releases, see What's new and planned for Data Factory in Microsoft Fabric - Microsoft Fabric | Microsoft Learn


Hope this is helpful. Please let me know in case of further queries.

Anonymous
Not applicable

Hi @mds123 ,

We haven't heard from you since the last response and were just checking back to see if your query was answered.
Otherwise, please reply with more details and we will try to help.

Thanks

