Hi,
I am trying to create a historical data table that keeps the results of old refreshes and adds new ones on top. Basically, I am running a query to show open orders for a certain week, and I would like to keep the results from all previous weeks unchanged and just add the current week's data whenever I refresh the query.
Example: in week 1 there are the following open orders: order1, order2, order3. I use the timestamp of the refresh. In week 2 I run the query again and would like to keep the data from week 1 exactly as it is, and then add whatever the same query returns now - say order1 and order3 are still open, order2 has already closed, and there is a new open order, order4. I would like the data to look something like this (the week column coming from the refresh timestamp):
wk1 order1
wk1 order2
wk1 order3
wk2 order1
wk2 order3
wk2 order4
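To make the pattern concrete, here is a rough sketch of the append I have in mind (Python with pandas, purely as a stand-in for wherever the history ends up being stored; the column names are just illustrative):

import pandas as pd

# Existing history from week 1 (kept exactly as it is)
history = pd.DataFrame({"week": ["wk1", "wk1", "wk1"],
                        "order": ["order1", "order2", "order3"]})

# Result of the current refresh: order2 has closed, order4 is new
current = pd.DataFrame({"order": ["order1", "order3", "order4"]})
current.insert(0, "week", "wk2")  # week label derived from the refresh timestamp

# Append the new snapshot without touching the old rows
history = pd.concat([history, current], ignore_index=True)
print(history)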
I've already tried setting up incremental refresh, but it overwrites both the timestamp and the data itself.
Is there any way to do this while avoiding snapshots (creating historical files with Power Automate)? Maybe by using two tables, where one (current status) appends into the other (historical data)?
Thanks!
Hi @JakubV
to keep historical data, you will need to store your data somewhere.
Within the Microsoft Fabric environment you could achieve this using a data warehouse combined with Gen2 dataflows.
First you will want to create a warehouse. Here's how: Create a Warehouse - Microsoft Fabric | Microsoft Learn
After that, you will set up your Gen2 dataflow in a workspace of your choice just like a standard dataflow, but additionally you will add a data destination in the bottom right corner of the dataflow interface. Just choose your warehouse as the destination, and every time you refresh the dataflow, the result of the table will be written to the warehouse. While setting up the destination, you will want to use the "append" option.
Dataflow Gen2 data destinations and managed settings - Microsoft Fabric | Microsoft Learn
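If it helps, here is a rough sketch of what the "append" behaviour amounts to once the warehouse table exists (plain Python with an in-memory SQLite table standing in for the warehouse; table and column names are only examples):

import sqlite3

# Stand-in for the warehouse table the dataflow writes to
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE open_orders_history (refresh_week TEXT, order_id TEXT)")

def refresh(week_label, open_orders):
    """One dataflow refresh with the 'append' destination:
    new rows are added, existing rows are left untouched."""
    con.executemany("INSERT INTO open_orders_history VALUES (?, ?)",
                    [(week_label, o) for o in open_orders])
    con.commit()

refresh("wk1", ["order1", "order2", "order3"])
refresh("wk2", ["order1", "order3", "order4"])  # order2 closed, order4 new

for row in con.execute("SELECT * FROM open_orders_history"):
    print(row)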
Hope this helps.
Cheers
Tim
Thanks. Of the several possible setups I tried, this would be the only solution. I will either run a solution that creates external data files with the help of Power Automate (and appends the result files), or have a SQL database created with a set of historical and temporary (current data run) tables.
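For the SQL database route, the rough idea would be a current-run table that is replaced on every refresh plus a history table it appends into, along these lines (an illustrative sketch; table names are placeholders):

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE current_run (refresh_week TEXT, order_id TEXT);
    CREATE TABLE order_history (refresh_week TEXT, order_id TEXT);
""")

def load_refresh(week_label, open_orders):
    # The current-run table is replaced on every refresh...
    con.execute("DELETE FROM current_run")
    con.executemany("INSERT INTO current_run VALUES (?, ?)",
                    [(week_label, o) for o in open_orders])
    # ...and its contents are appended to the history, which is never rewritten
    con.execute("INSERT INTO order_history SELECT * FROM current_run")
    con.commit()

load_refresh("wk1", ["order1", "order2", "order3"])
load_refresh("wk2", ["order1", "order3", "order4"])
print(con.execute("SELECT * FROM order_history").fetchall())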
Sounds good. If you need any further advice, don't hesitate to ask.