smpa01
Super User

Initial refresh to get past

I have a dataset containing massive amounts of data from 3 sources, and I have set up incremental refresh (IR) on each of them.

 

However, when I initiate the 1st refresh (full refresh), I get the following error:

The database was evicted and the operation cancelled to load balance the CPU load on the node. Please try again later.

I only need the initial refresh to get past this, because once it succeeds I will never have to look back, as IR is already factored in.

 

How can I get past the 1st refresh?

@GilbertQ  @bcdobbs @AlexisOlson 

Did I answer your question? Mark my post as a solution!
Proud to be a Super User!
My custom visualization projects
Plotting Live Sound: Viz1
Beautiful News:Viz1, Viz2, Viz3
Visual Capitalist: Working Hrs
5 REPLIES
bcdobbs
Super User

Morning. I've never experienced this, but I think the following might work. Are you starting off in PBI Desktop? If so, configure the IR policy with a very narrow range that will only load a small amount of data, and publish it to the service.

bcdobbs_0-1679650023723.png

It should hopefully publish. At that point you need to force a refresh in the service so that it sets up the initial policy and partitions:

bcdobbs_1-1679650460440.png

 

 

Then you need to connect to the XMLA endpoint with Tabular Editor and modify the policy. I've tried this with TE 2 so that it's free:

 

1) Click the IR table, and in the property pane you should see:

bcdobbs_2-1679650517394.png

 

2) Change the RollingWindowPeriods to what you need:

bcdobbs_3-1679650568772.png
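For reference, the property you're changing here is part of the table's refresh policy in the model's TMSL definition. A rough sketch of the relevant fragment, assuming a basic policy with a 5-year rolling window refreshed in monthly increments (the values are placeholders; property names are from the TMSL RefreshPolicy object):

```json
{
  "refreshPolicy": {
    "policyType": "basic",
    "rollingWindowGranularity": "year",
    "rollingWindowPeriods": 5,
    "incrementalGranularity": "month",
    "incrementalPeriods": 1
  }
}
```

The rolling window controls how much history is kept; the incremental settings control how much is re-queried on each scheduled refresh.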

 

3) Click save changes to connected database.

 

4) Right-click the table and click "Apply Refresh Policy", which should generate the partitions:

bcdobbs_4-1679650718415.png

5) You can then refresh each partition in turn, either from TE or SSMS.
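For the SSMS route, each partition can be refreshed with a TMSL script run in an XMLA query window against the workspace endpoint. A minimal sketch, where the database, table, and partition names are placeholders you'd replace with your own (partition names are visible in TE or SSMS after the policy is applied):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyDataset",
        "table": "Sales",
        "partition": "2023Q1"
      }
    ]
  }
}
```

Running one partition per command keeps each operation small enough that it's less likely to hit the capacity's memory or CPU limits.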

 

Hoping that works!



Ben Dobbs

LinkedIn | Twitter | Blog

Did I answer your question? Mark my post as a solution! This will help others on the forum!
Appreciate your Kudos!!

This might also help:

Incremental Refresh | Tabular Editor Documentation



collinq
Super User

Hey @smpa01 ,

 

I think that you will have to use XMLA endpoints to do the initial load.  (https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-connect-tools)  And, use the XMLA endpoint for your initial load with your incremental refresh - https://learn.microsoft.com/en-us/power-bi/connect-data/incremental-refresh-xmla

 

Basically, I am saying that you have to set up your incremental refresh, with the XMLA endpoint, with the partitions pre-set (maybe use SSMS or Tabular Editor??) and then the very first load IS an incremental load.
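If I'm reading the advanced incremental refresh docs correctly, the TMSL refresh command sent over the XMLA endpoint also accepts applyRefreshPolicy and effectiveDate options, which let that very first refresh create the partitions as of a date you choose. A hedged sketch (dataset and table names are placeholders, and you should check the linked docs for the exact option names supported by your capacity):

```json
{
  "refresh": {
    "type": "full",
    "applyRefreshPolicy": true,
    "effectiveDate": "12/31/2023",
    "objects": [
      {
        "database": "MyDataset",
        "table": "Sales"
      }
    ]
  }
}
```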

 




Did I answer your question? Mark my post as a solution!

Proud to be a Datanaut!
Private message me for consulting or training needs.




  • How can you upload a dataset with pre-set partitions when the initial refresh has not happened? In my understanding, unless the first refresh happens, the partitions are not created. If I could upload a dataset with pre-set partitions (I don't know how that can happen when the first refresh has not happened), I could simply refresh the partitions manually from SSMS before putting it on a scheduled refresh with IR factored in. I like your idea but I can't put the pieces together. @collinq 

Hey @smpa01 ,

 

The way it works is that the first load is done incrementally, so you don't have to do a full refresh and then increment afterwards.  From a not-so-technical view, think of it as the partitions running one at a time and then appending themselves - but each partition goes into the queue by itself and so you can update millions of rows of data based on your partition (like months from years).  So, in the end, you have what you needed - all the years, but they came in one month at a time.
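If you want that "one partition at a time" behavior to be explicit when you script the refresh yourself, TMSL has a sequence command with a maxParallelism setting that queues the operations serially. A sketch under the same placeholder names as before:

```json
{
  "sequence": {
    "maxParallelism": 1,
    "operations": [
      {
        "refresh": {
          "type": "full",
          "objects": [
            { "database": "MyDataset", "table": "Sales", "partition": "2023Q1" },
            { "database": "MyDataset", "table": "Sales", "partition": "2023Q2" }
          ]
        }
      }
    ]
  }
}
```

With maxParallelism set to 1, each partition refresh completes before the next begins, which keeps the peak load on the node low.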

 

You can do this with a number of tools, SQL and Tabular Editor being the two most popular. This article lays it out pretty well using Tabular Editor:

Incremental Refresh | Tabular Editor Documentation

 

FYI - Version 2 will work fine, you do not have to purchase V3 to do what I am proposing.

 

 







