Hi all,
I searched other posts for a solution but couldn't find anything I could apply.
I have a table with 40+ columns and over 1.5 million rows, which is growing every day.
In addition, I have lots of quite complicated dashboards and related tables with calculated columns based on this source table, making the data refresh process very long.
For the last few days, my Surface Pro has no longer been able to process the entire refresh, so I need a way to keep using all the data while only refreshing it incrementally.
As my data is connected to folders containing daily CSVs, I tried splitting the files into two separate folders, creating an identical table with the historical data, and then appending it to the core table. Unfortunately, refreshing the core table still tries to refresh both folders, making my memory go 'puff'.
Any idea how I could solve this problem?
Thanks
Try turning off the relationship detection in the file. I think that might trigger a reload of all tables trying to find new relationships.
If you go into Edit Query on your historical and right click on it, you can turn off "Include in report refresh" which should stop it from re-pulling the data for that query.
Hi @jdbuchanan71, unfortunately when refreshing the core table, which is now connected only to the folder containing a few CSVs, it still goes through all the historical CSVs from the appended table, killing my memory.
I guess appending two tables results in pulling all the data every time.
Is there any other way to solve this problem?
Sorry that isn't working. I tested on my side and saw the same behavior: the Append query forces a refresh of all the data. If you don't have a combine query, then refreshing the report won't pull the old files. Would keeping them separate and combining them in DAX be a possible solution?
I don't know how many measures you have, but if the model is fairly straightforward, linking the "Old" table to the lookup tables (customer, employee, date, etc.) would be fairly simple, and then the measures just become:
Amount = SUM ( Old[Value] ) + SUM ( 'Current'[Value] )
Thanks for the suggestion @jdbuchanan71, but I have too many measures, and additional tables already feed from the core table.
I was thinking about creating a new table built from both tables; that way they stay the true source of the data, while the new table becomes the battleground for all the measures. Not sure that will work, but I will have a go.
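That "new table built from both" idea could be sketched as a DAX calculated table. This is only a sketch under assumptions: the table names Old and 'Current' are taken from the earlier measure example, the two tables are assumed to have identical column layouts (a requirement of UNION), and a calculated table is recomputed at refresh time, so this only avoids the memory spike if the historical query itself is excluded from report refresh.

```DAX
-- Hypothetical calculated table combining the two source tables.
-- Assumes Old and 'Current' share the same columns in the same order.
Combined = UNION ( Old, 'Current' )

-- Measures can then target the single combined table, e.g.:
Amount = SUM ( Combined[Value] )
```

Existing measures would still need to be repointed from the core table to the combined table, which may or may not be less work than rewriting them as paired SUMs.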
Hi. Did you ever find a workaround for this problem? I am trying to fix the same issue.