mds123
Frequent Visitor

Saving a daily copy of inventory table to lakehouse

I'm brand new to Fabric and the Data Factory tools. I have an inventory table that shows the current inventory for all my products. I'd like to build a table that holds a daily snapshot of the inventory. I'm assuming I could do this with incremental refresh, but how do I avoid deleting the historic information?

 

I'm brand new to all the Fabric / Data Factory tools, so any help would be amazing.

1 ACCEPTED SOLUTION
miguel
Community Admin

It doesn't seem like you'd need incremental refresh.

 

You can just run the dataflow every day, add a timestamp column, and append each day's result to a "snapshots" table. Again, it doesn't seem like you need incremental refresh at all; a simple Append operation would suffice.
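The append pattern described above can be sketched in a few lines. This is a minimal illustration using pandas as a stand-in for the dataflow; the table and column names (`snapshots`, `snapshot_date`, `qty`) are hypothetical, not part of any Fabric API.

```python
from datetime import date
import pandas as pd

def take_snapshot(inventory: pd.DataFrame, snapshots: pd.DataFrame,
                  snapshot_date: date) -> pd.DataFrame:
    """Stamp the current inventory with a date and append it to the history."""
    stamped = inventory.copy()
    stamped["snapshot_date"] = snapshot_date  # marks which daily run each row belongs to
    if snapshots.empty:
        return stamped
    # Append: old rows are kept, new rows are added on top
    return pd.concat([snapshots, stamped], ignore_index=True)

# Hypothetical current-inventory table
inventory = pd.DataFrame({"product": ["A", "B"], "qty": [10, 5]})

# Simulate two daily runs
snapshots = pd.DataFrame()
snapshots = take_snapshot(inventory, snapshots, date(2024, 6, 1))
inventory.loc[0, "qty"] = 8          # stock changes overnight
snapshots = take_snapshot(inventory, snapshots, date(2024, 6, 2))

print(len(snapshots))  # 4 rows: 2 products x 2 daily snapshots
```

Because each run only appends, the history grows by one copy of the inventory per day and nothing is ever deleted.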


6 REPLIES
miguel
Community Admin

You don't really need incremental refresh for this. If your goal is to take a snapshot, just run the Dataflow Gen2 every day (for example) and append the data to a "snapshots" table. Each run would be the snapshot for a particular date. You can also add a column with a timestamp of when the process was executed.

 

Again, it doesn't appear that you need incremental refresh, but that feature is coming soon as well.

frithjof_v
Skilled Sharer

If you don't want to delete historic data, you can use Append mode. That way you only add new rows, without deleting old ones. I would add a snapshot timestamp column so you can keep track of which rows belong to which snapshot.
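The payoff of that snapshot timestamp column is that any historical view can be recovered later by filtering on it. A small pandas sketch (all names hypothetical):

```python
import pandas as pd

# Hypothetical accumulated "snapshots" table after two daily appends
snapshots = pd.DataFrame({
    "product": ["A", "B", "A", "B"],
    "qty": [10, 5, 8, 5],
    "snapshot_date": ["2024-06-01", "2024-06-01", "2024-06-02", "2024-06-02"],
})

# Inventory exactly as it looked on a given day
day_one = snapshots[snapshots["snapshot_date"] == "2024-06-01"]
print(day_one["qty"].sum())  # 15

# Current inventory = rows carrying the most recent snapshot_date
latest = snapshots[snapshots["snapshot_date"] == snapshots["snapshot_date"].max()]
print(latest["qty"].sum())  # 13
```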

v-gchenna-msft
Community Support

Hi @mds123 ,

Thanks for using Fabric Community.
If you're looking for an incremental load, you can refer to these documents:
Incremental Load in Fabric
Pattern to incrementally amass data with Dataflow Gen2 - Microsoft Fabric | Microsoft Learn

FYI: incremental refresh will be provided in a future release.
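For contrast with the snapshot approach, the incremental-load pattern in those documents boils down to tracking a watermark and appending only rows newer than it. A hedged pandas sketch, with hypothetical column names and toy integer timestamps:

```python
import pandas as pd

def incremental_load(source: pd.DataFrame, target: pd.DataFrame,
                     watermark_col: str = "modified_at") -> pd.DataFrame:
    """Append only the source rows newer than the highest watermark in target."""
    if target.empty:
        new_rows = source
    else:
        watermark = target[watermark_col].max()       # high-water mark already loaded
        new_rows = source[source[watermark_col] > watermark]
    return pd.concat([target, new_rows], ignore_index=True)

# Hypothetical source and already-loaded target tables
source = pd.DataFrame({"product": ["A", "B", "C"], "modified_at": [1, 2, 3]})
target = pd.DataFrame({"product": ["A", "B"], "modified_at": [1, 2]})

loaded = incremental_load(source, target)
print(len(loaded))  # 3: only product C was new
```

Note the difference: incremental load avoids re-copying unchanged rows, whereas the snapshot approach deliberately copies the whole table each day to preserve history.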



To learn about future releases, see What's new and planned for Data Factory in Microsoft Fabric - Microsoft Fabric | Microsoft Learn


Hope this is helpful. Please let me know in case of further queries.

Hi @mds123 ,

We haven't heard from you since the last response and wanted to check back to see whether your query was answered. If not, please reply with more details and we will try to help.

Thanks

