fhdalsafi
Frequent Visitor

Incremental refresh problem

Hello, 

 

I am having a problem with the refresh; the error is shown in the picture below.

fhdalsafi_0-1642935781501.png

 

It says that the database size before execution is 14.2GB.

 

When I opened DAX Studio, it showed that the dataset size is 12.6 GB (a couple of Excel sheets are also connected, but they don't add much to the size).

fhdalsafi_1-1642935929926.png

 

I have incremental refresh set up to refresh only a small partition of the data.

My question is: why is the database size before execution the same as the full dataset size? It's as if the incremental refresh is not refreshing just part of the data, but processing the whole dataset instead.

 

Appreciate any assistance.

 

Thanks in advance.

 

1 ACCEPTED SOLUTION
v-rongtiep-msft
Community Support

Hi @fhdalsafi ,

First, please make sure incremental refresh can actually work with your source. Incremental refresh relies on query folding; most relational data sources (SQL Server and similar) support it, while flat files generally do not. Could you tell me what data source you are using? Please refer to the following document to confirm you have configured it correctly.

Configure incremental refresh and real-time data 

 

Typically, the effective memory limit for a command is calculated from the memory allowed for the dataset by the capacity (25 GB, 50 GB, or 100 GB) and how much memory the dataset is already consuming when the command starts executing. For example, a dataset using 12 GB on a P1 capacity (25 GB) allows an effective memory limit of 13 GB for a new command. However, the effective memory limit can be further constrained by the DbPropMsmdRequestMemoryLimit XMLA property when an application optionally specifies it. Continuing the previous example, if 10 GB is specified in the DbPropMsmdRequestMemoryLimit property, the command's effective limit is further reduced to 10 GB.
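The calculation above is simple arithmetic, and a small sketch makes it easy to check your own numbers (the function name is mine, not part of any Power BI API):

```python
def effective_command_limit_gb(capacity_limit_gb, dataset_in_use_gb,
                               request_memory_limit_gb=None):
    """Effective memory limit for a new command, per the rule above:
    the capacity's per-dataset limit minus the memory the dataset is
    already consuming, optionally capped by DbPropMsmdRequestMemoryLimit."""
    limit = capacity_limit_gb - dataset_in_use_gb
    if request_memory_limit_gb is not None:
        limit = min(limit, request_memory_limit_gb)
    return limit

# P1 capacity (25 GB), dataset already using 12 GB
print(effective_command_limit_gb(25, 12))      # 13
# Same, but DbPropMsmdRequestMemoryLimit = 10 GB
print(effective_command_limit_gb(25, 12, 10))  # 10
```

If the refresh command needs more memory than this limit while executing, it fails with the resource-governing error shown in your screenshot.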

 

To avoid exceeding the effective memory limit:

  • Upgrade to a larger Premium capacity (SKU) size for the dataset.
  • Reduce the memory footprint of your dataset by limiting the amount of data loaded with each refresh.
  • For refresh operations through the XMLA endpoint, reduce the number of partitions being processed in parallel. Too many partitions being processed in parallel with a single command can exceed the effective memory limit.
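For the third bullet, a TMSL `sequence` command sent through the XMLA endpoint can cap how many partitions are processed at once via `maxParallelism`. A sketch, with placeholder database, table, and partition names:

```json
{
  "sequence": {
    "maxParallelism": 2,
    "operations": [
      {
        "refresh": {
          "type": "full",
          "objects": [
            { "database": "MyDataset", "table": "FactSales", "partition": "FactSales-2022Q1" },
            { "database": "MyDataset", "table": "FactSales", "partition": "FactSales-2022Q2" }
          ]
        }
      }
    ]
  }
}
```

Lower parallelism means fewer partitions hold transient refresh memory at the same time, which keeps the command under its effective limit at the cost of a longer refresh.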

 

More details: Resource governing command memory limit in Premium Gen 2 

 

Best Regards

Community Support Team _ Polly

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


aj1973
Community Champion

Hi @fhdalsafi 

The message says it all. Do you have permission to manage the Premium capacity? Otherwise you need to reduce the size of your dataset. To answer your question: did you set up incremental refresh correctly, and did you apply it to the fact table?

Regards
Amine Jerbi

If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook
