RBRADY
New Member

Salesforce Dataflow "invalid query locator" error only with incremental refresh enabled

Hello all,

 

I've been bashing my head against this problem for weeks now. We recently got Premium capacity licensing and have been trying to take advantage of incremental refresh wherever possible, since we have several large objects that often take upwards of 3-4 hours to refresh. In my most recent attempts, I took a single object, "Work Detail", and performed no transformations (I have also tried filtering it down to only lines created for my division since 1/1/2021). This will usually refresh successfully in several hours, but occasionally times out. Enabling incremental refresh causes the refresh to fail in either 14 seconds or 45 seconds with the "Invalid Query Locator" error, depending on whether "Detect Data Changes" is enabled. We have a total of 5 objects where this happens, but another 5 objects where it works without issues.

 

I have tried many different API versions (52, 51, 50, 44, 42, 40, 30, and 29 at least), CreateNavigationProperties = true, with and without Detect Data Changes, and doing a regular refresh before enabling incremental refresh... all yield the same result. Is there any way to manually limit the number of query locators running at one time? I am assuming that incremental refresh opens one locator per X rows, which would explain why this works on the smaller objects. My thinking is that if I can get the initial refresh to run, the subsequent incremental refreshes should execute without a hitch, since my organization isn't creating that many new rows of data on a daily basis. Am I barking up the wrong tree here? Thank you in advance for your help!
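For context, here is a rough Power Query M sketch of the kind of entity query involved, assuming the standard Salesforce Objects connector (Salesforce.Data) with its optional ApiVersion and CreateNavigationProperties settings; "WorkDetail__c" and "Division__c" are placeholder names, not our actual API names:

```
let
    // Salesforce Objects connector; ApiVersion and CreateNavigationProperties
    // are the optional connector settings mentioned above
    Source = Salesforce.Data("https://login.salesforce.com/", [
        ApiVersion = 52,
        CreateNavigationProperties = false
    ]),
    // "WorkDetail__c" and "Division__c" are placeholders for the real API names
    WorkDetail = Source{[Name = "WorkDetail__c"]}[Data],
    // Keep only my division's lines created since 1/1/2021
    Filtered = Table.SelectRows(
        WorkDetail,
        each [Division__c] = "My Division"
            and Date.From([CreatedDate]) >= #date(2021, 1, 1)
    )
in
    Filtered
```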

4 REPLIES
moldepc
Frequent Visitor

I am having the same issue.  If I do an incremental refresh on smaller Salesforce objects, it works just fine.  If I try it on larger ones doing exactly the same process, it fails with the same error.  Did you figure out how to handle this yet?

Sort of. It's clunky, but I have essentially created three levels of dataflows:

1) the first level is a separate dataflow for each object (Work Details, Installed Product, Work Orders, and so on all have their own dataflows). In my case, I have about 6 dataflows. 
1.b) make sure to include the "Last Modified Date" field in each object, and your last step before saving the dataflow needs to be sorting descending by that field during initial setup (see the sketch after this list). That part is VERY important!

1.c) refresh each dataflow as you go before setting up the next one, with "detect data changes" enabled.

 

2) the initial collator: I had to set up two separate DFs during initial testing, but you can do a single one. In this one you can also set up incremental refresh, but you HAVE to delete duplicate rows by ID inside the DF (also shown in the sketch below). Power BI will create duplicate records at some point rather than keeping the newest data if you do not do this. Ask me how I know...

 

3) the final consumption DF: this one is honestly not all that necessary, but I found it works better performance-wise in my use case. After the first two levels collect and prep the data, my last, single DF manipulates all of that data to get it ready for use in my reports: renaming fields, adding custom columns, etc.
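To illustrate steps 1.b and 2, here is a condensed M sketch; the source and object names are placeholders, and in the real setup the descending sort lives in each level-1 dataflow while the de-duplication lives in the level-2 collator:

```
let
    // Placeholder source; in practice level 1 pulls directly from Salesforce
    // and level 2 reads the level-1 dataflow as a linked entity
    Source = Salesforce.Data("https://login.salesforce.com/", [ApiVersion = 52]),
    WorkDetail = Source{[Name = "WorkDetail__c"]}[Data],

    // Step 1.b: the last step of each level-1 dataflow sorts newest-modified rows first
    Sorted = Table.Sort(WorkDetail, {{"LastModifiedDate", Order.Descending}}),

    // Step 2 (collator): keep one row per Salesforce Id so overlapping refresh
    // windows don't leave stale duplicates behind
    Deduplicated = Table.Distinct(Sorted, {"Id"})
in
    Deduplicated
```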

 

Lastly, I have the first level of DFs refreshing every hour (I see about four failures a day due to timeouts), the second refreshes twice a day, and the third refreshes only once per day.

This is a very clunky workaround, but it works, and it is surprisingly more reliable and faster than doing full refreshes every time. It is all temporary, though, as we will be switching over to an Azure warehouse in the near future, at which point at least level 1 will disappear, if not level 2 as well.

 

Good luck!

v-easonf-msft
Community Support

Hi, @RBRADY 

Please check this related thread to see if it helps:

Salesforce Object (Dataflow Entity) fails refresh when Incremental Refresh is enabled

 

Best Regards,
Community Support Team _ Eason

Thank you for your response, but unfortunately that thread doesn't help. The solution there was essentially "15 minutes isn't bad, so don't use incremental refresh." My refreshes take hours when they complete at all, and are unreliable overall. In our case, we upgraded to Premium largely because of the incremental refresh capabilities, and now it seems we can't use it to serve our largest reporting data source.
