Sreeja_G
Employee

Loading large initial data sets into semantic models results in a Resource Governing: memory error

Hi,

 

1.

If I run DAX queries against my semantic model by directly loading large tables, I get this error:

Resource Governing: Query has exceeded the memory limit

If I feed the data in smaller sets, more slowly, it all loads successfully in the end. Can you help me understand the reason behind this?

 

2.

Also, I started testing my dataset with userprincipalname() using one test alias. After continuous testing, even if I don't specify any alias and just run the DAX query directly, it defaults to returning results for the previously used alias, even though I did not mention it. Can you please help me understand this behavior?

1 ACCEPTED SOLUTION
v-yifanw-msft
Community Support

Hi @Sreeja_G ,

As @lbendlin said, there seems to be some ambiguity in the problem you are having, but here are some suggestions based on my understanding:
Issue 1: Resource Governing: Query has exceeded the memory limit
Each capacity has an effective memory limit based on its SKU (e.g., P1, P2, P3), which specifies the maximum amount of memory that can be consumed by a single operation. This limit ensures fair resource allocation and prevents any single operation from monopolising capacity resources.

The reason you can load smaller datasets without hitting this limit is likely because smaller queries consume less memory, staying within the limits of available resources. When you try to load a large table all at once, the memory required for the operation exceeds the effective memory limit, triggering an error.

To mitigate this problem, consider the following steps:

  1. Optimise the data model to reduce the memory footprint. This may involve filtering unnecessary columns or rows, optimising data types or aggregating data at a higher level.
  2. Do not load large tables all at once. Use incremental loading to break the data load into smaller, manageable chunks.
  3. If you experience these limitations frequently, consider upgrading to a premium capacity SKU to increase the effective memory limit.
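Step 2 above can be sketched as follows. This is a minimal illustration, not an official API: it assumes a hypothetical fact table named Sales with a Sales[OrderDate] column, and simply generates one date-filtered DAX query per monthly chunk so that each individual query stays within the memory limit.

```python
from datetime import date

def monthly_chunks(start: date, end: date):
    """Yield (chunk_start, chunk_end) date pairs covering [start, end] month by month."""
    current = date(start.year, start.month, 1)
    while current <= end:
        # Compute the first day of the next month.
        if current.month == 12:
            nxt = date(current.year + 1, 1, 1)
        else:
            nxt = date(current.year, current.month + 1, 1)
        yield max(current, start), min(nxt, end)
        current = nxt

def chunked_dax_queries(start: date, end: date):
    """Build one filtered DAX query per month instead of one huge EVALUATE.

    'Sales' and 'Sales[OrderDate]' are placeholder names for illustration.
    """
    template = (
        "EVALUATE FILTER(Sales, "
        "Sales[OrderDate] >= DATE({0}, {1}, {2}) && "
        "Sales[OrderDate] < DATE({3}, {4}, {5}))"
    )
    return [
        template.format(lo.year, lo.month, lo.day, hi.year, hi.month, hi.day)
        for lo, hi in monthly_chunks(start, end)
    ]

# Three months of data become three smaller queries.
for query in chunked_dax_queries(date(2024, 1, 15), date(2024, 3, 10)):
    print(query)
```

Each resulting query touches only one month of rows, so its working set is a fraction of a single table-wide EVALUATE.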

Issue 2: userprincipalname()
If you find that running a DAX query defaults to a previously tested alias, even if it is not explicitly mentioned, this may have something to do with how the Power BI service caches the query or user context to optimise performance. It is also possible that the test environment still retains some user context that affects subsequent queries.

To further diagnose this behaviour, consider:

  1. Clearing any caches in your Power BI environment.
  2. Ensuring that your DAX queries explicitly define user contexts where necessary to avoid ambiguity.
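One way to make the user context explicit, assuming you run the DAX through the Power BI REST executeQueries endpoint, is to pass an effective identity in the request rather than relying on whatever identity the service last evaluated. The sketch below only builds the request body; the alias and the query are placeholders, and the field names follow the public executeQueries API as I understand it, so verify them against the current documentation.

```python
import json

def build_execute_queries_body(dax_query, effective_user=None):
    """Build the JSON body for POST .../datasets/{datasetId}/executeQueries.

    Passing 'impersonatedUserName' pins the identity that USERPRINCIPALNAME()
    resolves to, instead of relying on a previously cached user context.
    (Field names per the public executeQueries API; verify against current docs.)
    """
    body = {
        "queries": [{"query": dax_query}],
        "serializerSettings": {"includeNulls": True},
    }
    if effective_user is not None:
        body["impersonatedUserName"] = effective_user
    return body

# Placeholder alias and query, for illustration only.
body = build_execute_queries_body(
    'EVALUATE ROW("User", USERPRINCIPALNAME())',
    effective_user="test.alias@contoso.com",
)
print(json.dumps(body, indent=2))
```

Running the same query once with and once without the explicit identity makes it easy to see whether a stale cached context, rather than your query, is supplying the alias.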

Best Regards,

Ada Wang

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 


2 REPLIES
lbendlin
Super User

Not clear where you are doing this. In Excel, in Power BI Desktop, or in a dataflow?
