PeKaDe
Advocate I

Incremental Refresh - is there a way to handle record updates?

Hi Community,

After some months of struggling to set up a working incremental refresh solution on my SQL DB source, I came to the sad conclusion that it can't be done. Power BI does not seem to be able to handle an extremely common scenario - record updates. I'm writing this post to:

  • either get corrected (which I will very much appreciate as it will make my solution work)
  • or get a suggestion on how to avoid a full refresh every 24 hours

 

So my supposedly typical scenario is the following: 

  • I have a DB of persons with a main table called Persons
  • each person has a number of attributes, e.g. Date of Birth, Name, Gender, etc.
  • there is a proper unique key (ID of a person) and 2 important technical attributes: CreatedAt and ModifiedAt.
  • CreatedAt obviously never changes, while ModifiedAt gets updated every time something changes about a given person (e.g. they correct their Date of Birth).

All this data (around 30M records, so not a trivial volume) is pulled into Power BI every night. Having heard about the incremental refresh feature, I proceeded to configure the following:

  • set up incremental refresh with RangeStart/RangeEnd, etc.
  • configured Power BI so that it looks at the last 7 days based on the ModifiedAt field (roughly the partition query sketched below)
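For illustration, the query Power BI folds down to SQL for the refreshed window should look roughly like this (column names approximate my Persons table; the service substitutes literal dates for RangeStart/RangeEnd):

SELECT PersonID, Name, DateOfBirth, Gender, CreatedAt, ModifiedAt
FROM dbo.Persons
WHERE ModifiedAt >= '2024-01-01T00:00:00'  -- RangeStart
  AND ModifiedAt <  '2024-01-08T00:00:00'; -- RangeEnd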

I assumed that Power BI would query the DB to get all records modified in the last 7 days and subsequently update the records it stores in its cache. The source is SQL, there's a primary key, it all seemed natural. But I learned that this is not how Power BI actually works.

It removes 7 days' worth of data from the local cache and replaces it with 7 days' worth of data freshly queried from SQL. The primary key is never considered, so it effectively creates duplicates: the original record stays in the cache (in an older partition) while the new, updated record is added during the incremental refresh. So I'm getting key duplication errors and so on.
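To make that concrete, here's a T-SQL model of what the incremental refresh effectively does to the cached table (the cache is of course not SQL, so this is purely illustrative):

DECLARE @RangeStart datetime2 = '2024-01-01', @RangeEnd datetime2 = '2024-01-08';

-- Step 1: throw away whatever the cache holds for the refresh window.
DELETE FROM Cache.Persons
WHERE ModifiedAt >= @RangeStart AND ModifiedAt < @RangeEnd;

-- Step 2: re-insert the current source rows for the same window.
INSERT INTO Cache.Persons (PersonID, Name, CreatedAt, ModifiedAt)
SELECT PersonID, Name, CreatedAt, ModifiedAt
FROM dbo.Persons
WHERE ModifiedAt >= @RangeStart AND ModifiedAt < @RangeEnd;

-- The old version of an updated record carries its old ModifiedAt in the
-- cache, so step 1 never touches it, while step 2 brings in the new version:
-- the same PersonID now exists twice.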

Am I correct so far in my understanding of how Power BI works?

 

If I am correct, is there a way to handle updates? I'm guessing that the update scenario must be super common, so I struggle to imagine that this is simply not possible to configure/fix somehow.

 

For people who would suggest the "Detect Data Changes" functionality, I'd like to say that I tried it out. I divided my 5 years of data into 60 monthly partitions and configured Power BI to get 60 months' worth of data based on the "CreatedAt" field but to detect data changes (in each partition) based on the "ModifiedAt" value. Unfortunately, the nature of the data is such that in any 24-hour period at least 1 record from each partition gets updated. So I'm effectively doing a full refresh this way.
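As I understand the documentation, the change detection boils down to one query like this per monthly partition (boundaries illustrative):

-- Compare against the value stored at the previous refresh; if it differs,
-- the entire partition is re-queried. With at least one touched record per
-- partition per day, all 60 partitions get flagged.
SELECT MAX(ModifiedAt)
FROM dbo.Persons
WHERE CreatedAt >= '2020-01-01' AND CreatedAt < '2020-02-01';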

 

Any ideas, People? 🙂 

Thanks in advance!

PKD

10 REPLIES
DanDensley
Frequent Visitor

Question - if I can be certain that any modifications will happen within 10 days of creation, then incremental refresh should work for me if I base it on the modified date and set the "Refresh rows from the past" setting to 10?

This is event data where occasionally events can be corrected and the modified date will be updated. So if I keep the past-days setting wide enough to cover the potential correction period, then I should be covered against the problems you have experienced.

Top of mind: yes, I think so. In fact with incremental loading, be it in Power BI or in a data warehouse, you just determine your sliding window. Yours is max 10 days ago. If you can test it, then you'll be sure 🙂  

GrantM-NVS
New Member

Howdy OP!

 

I'm facing similar issues, but am using KNIME - an ETL tool - as a workaround. It's free, but you'll have to pay for a server license if you want to schedule workflows. Basically I'm updating my SQL Server with modified records and appending new records. My flow takes a few seconds and runs every 5 minutes. I'm pulling Salesforce data and unfortunately cannot use Power BI incremental refresh because it appears it cannot handle modified records.
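The upsert step amounts to something like this in T-SQL (a sketch with placeholder table and column names, not my actual Salesforce schema):

MERGE dbo.TargetRecords AS tgt
USING staging.SourceRecords AS src
    ON tgt.RecordID = src.RecordID
WHEN MATCHED AND src.ModifiedAt > tgt.ModifiedAt THEN
    -- modified record: overwrite the stored version in place
    UPDATE SET Name = src.Name, ModifiedAt = src.ModifiedAt
WHEN NOT MATCHED THEN
    -- new record: append it
    INSERT (RecordID, Name, CreatedAt, ModifiedAt)
    VALUES (src.RecordID, src.Name, src.CreatedAt, src.ModifiedAt);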

PaginatedDino
Helper I

@PeKaDe wow, I arrived a bit late at this sad party 🙂

Here I was, setting up incremental loading for big fact tables where a single fact can be updated after a period of time that exceeds the rolling period of the daily refresh. I ended up with duplicates like the ones you reported... 

This indeed seems like something basic that should just work, given the upsert/merge approach we are used to in data warehousing scenarios, but probably that would diminish the performance gain you get from the partition setup. 

The only thing I could think of is this:

- run the incremental processing during the week, accepting some data quality issues but keeping an acceptable duration;

- schedule, for this specific type of table, a full process of all partitions during the weekend. This would require, I think, a Premium Per User setup so that you could run XMLA statements that ignore the refresh policy, à la: 

{
  "refresh": {
    "type": "full",
    "applyRefreshPolicy": false,
    "objects": [
      {
        "database": "Model",
        "table": "updateable_table"
      }
    ]
  }
}
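(For what it's worth, a TMSL command like the one above can be executed against the workspace's XMLA endpoint - for example from an XMLA query window in SQL Server Management Studio - which is why Premium or Premium Per User is needed.)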

 

Indeed the detect-data-changes option can be interesting when only a portion of the partitions gets touched by updates, but I wonder where the tipping point is at which a full reprocess performs better. 

 

Did you find another way? 

Cheers

Hi @PaginatedDino , unfortunately not. I ended up optimizing my report, which currently allows me to run full refreshes on a daily basis. The main change was to denormalize my model so that 2 tables (each 25M rows) were combined into one. This made all the difference in the world, as previously I couldn't even refresh effectively (see my other thread here: https://community.powerbi.com/t5/Service/Never-ending-incremental-refresh-in-the-Service/m-p/2851829...).
This has all been quite depressing, so I don't want to hear about incremental refresh anymore, at least until they develop an update mode 🙂 I get your point about diminishing the performance gain, but on the other hand, if you use a primary key and filter by an indexed date column, it should work pretty fast and achieve what is needed. But I guess we will never know... no one from MS ever answered any of my questions here on the forum, and no one ever answered the feedback I sent via the Power BI survey. Moreover, I worked on this with an MS consultant, and they weren't able to identify the problem for 3 months, until I found it myself by guessing how this whole thing works. Sorry about this depressing answer 😞 


Good luck with your refresh! 🙂 

Hey @PeKaDe thanks for the reply, sending a virtual cheer-up fist bump 😛 from time to time we bump into these kinds of "puddings". The main thing I take away from this is a scale of update likelihood that determines how interesting incremental loading could be: 

1-----------50-------------100 

1 > only data (typically events) that never gets updated, hence perfect for incremental loading (partitioning)

50 > data that partially gets updated beyond the configured rolling refresh period, hence incremental loading + change data capture could work out

100 > highly volatile data with no predictable scattering of when something gets updated after the original insert, which would cause the change data capture to fetch almost all of the historical partitions anyway

 

Anyway, good to hear it works, and have a nice weekend!

PeKaDe
Advocate I

Hi again, I'd love to hear from someone who had this issue and somehow managed to solve it. I can share some additional insights from comparison testing.
I used a test environment for these tests, which means there are 20M records and the data is not updated as often. In fact, out of 5 years' worth of data (= 60 months), only 1 monthly partition is modified each day and needs updating. 

The testing results shocked me, and I'm on the verge of thinking that incremental refresh (with detect data changes) is close to unusable. I'll be very happy to be corrected - that's the whole point of my posting here: to find a flaw in what I'm doing, and a solution of course.

 

Case 1 - I configured full refresh and started it from a schedule. It took 9 minutes to refresh 20M records.

 

Case 2 - I configured incremental refresh as discussed before. I marked the "detect data changes" option and set Power BI to potentially refresh (if there were changes) 60 months into the past.

 

Surprise no. 1 - the initial full refresh took 53 minutes. That's the same full refresh that takes 9 minutes without this setting. I read the documentation at least twice, and I understand that the only other thing happening there is storing Max(ModifiedAt) for each of the 60 partitions.

 

Surprise no. 2 - the incremental refresh took 39 minutes. It needed to refresh just 1 monthly partition - more or less 2% of the data. We kept looking at the DB and couldn't see anything that would give us a clue about what was happening.

Needless to say, I have indexes on both CreatedAt and ModifiedAt; they were rebuilt before the testing took place. The same queries that Power BI runs execute on the DB within a couple of seconds. 
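For completeness, the indexes are shaped roughly like this (a sketch - the actual definitions may differ); the INCLUDE on the CreatedAt index lets the per-partition MAX(ModifiedAt) polling queries be answered by a simple range scan, without key lookups:

CREATE INDEX IX_Persons_CreatedAt
    ON dbo.Persons (CreatedAt) INCLUDE (ModifiedAt);

CREATE INDEX IX_Persons_ModifiedAt
    ON dbo.Persons (ModifiedAt);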

Please help anyone 🙂 

Greg_Deckler
Super User

@PeKaDe So your understanding of how Power BI incremental refresh works is essentially correct. A partition is created based upon the range parameters. In your case, I would think you would want to set the number of days refreshed based upon the CreatedAt column to 1 day. This would create a partition for each day in the dataset. That way, going forward, Detect Data Changes based upon the ModifiedAt value would only refresh the daily partitions for the particular days where data changed.

 

This also might be interesting reading for you: Advanced incremental refresh and real-time data with the XMLA endpoint in Power BI - Power BI | Micr...




I've come across a handful of posts resulting in "you can't do that" or with Power Query workarounds that run the server out of memory first.  This post about how to handle retroactive updates with incremental refresh is very helpful.  Thank you for posting.  With the right indexes on the source, keeping partitions by month might be workable.   

Hi @Greg_Deckler , thank you for the quick answer. I'm glad that I understood correctly how the incremental refresh works. I find it really surprising, as I don't think there are that many cases Power BI's incremental refresh can cover without doing proper updates based on the key. Do you know if perhaps Microsoft is working on solving this?

If not, I'd say this should be very prominent in their documentation as a very important limitation of the solution.

Anyway, your suggested workaround couldn't work - Microsoft only supports values from 1 to 120, and 5 years = 60 months ≈ 1,800 days, so I can't really go with that.

Any other suggestions? Perhaps someone had a similar problem and solved it?
