Sreeja_G
Microsoft Employee

Initial load of large data sets into semantic models results in a Resource Governing: Memory error

Hi,

 

1.

If I run DAX queries against my semantic model that directly load large tables, I get the error:

Resource Governing: Query has exceeded the memory limit

But if I feed the data in smaller sets, more slowly, the model eventually takes the full load. Can you help me understand the reason behind this?

 

2.

I also started testing my dataset with USERPRINCIPALNAME() using one test alias. After continuous testing, even when I don't specify any alias and run the DAX query directly, it defaults to returning results for the previously used alias. Can you please help me understand this behavior?

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @Sreeja_G ,

As @lbendlin said, there is some ambiguity in the problem you are describing, but here are some suggestions based on my understanding:
Issue 1: Resource Governing: Query has exceeded the memory limit
Each capacity has an effective memory limit based on its SKU (e.g., P1, P2, P3), which specifies the maximum amount of memory that can be consumed by a single operation. This limit ensures fair resource allocation and prevents any single operation from monopolising capacity resources.

Smaller data sets likely load without hitting this limit because smaller queries consume less memory and stay within the available resources. When you try to load a large table all at once, the memory required for the operation exceeds the effective memory limit, which triggers the error.

To mitigate this problem, consider the following steps:

  1. Optimise the data model to reduce the memory footprint. This may involve filtering unnecessary columns or rows, optimising data types or aggregating data at a higher level.
  2. Do not load large tables all at once. Use incremental loading to break the data load into smaller, manageable chunks.
  3. If you experience these limitations frequently, consider upgrading to a premium capacity SKU to increase the effective memory limit.
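For step 2, one way to avoid materialising an entire table in a single DAX query is to request only the columns and rows you actually need. A minimal sketch (the Sales table and its columns are hypothetical names for illustration):

```dax
// Instead of EVALUATE Sales (which materialises the whole table),
// project only the required columns and cap the row count.
// Note: TOPN without an explicit order expression returns an
// arbitrary subset of rows.
EVALUATE
TOPN (
    10000,
    SELECTCOLUMNS (
        Sales,
        "OrderDate", Sales[OrderDate],
        "Amount", Sales[Amount]
    )
)
```

Paging through the data in chunks like this keeps each individual query's memory footprint well under the per-operation limit.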

Issue 2: USERPRINCIPALNAME()
If you find that running a DAX query defaults to a previously tested alias, even if it is not explicitly mentioned, this may have something to do with how the Power BI service caches the query or user context to optimise performance. It is also possible that the test environment still retains some user context that affects subsequent queries.

To further diagnose this behaviour, consider:

  1. Clearing any caches in your Power BI environment.
  2. Ensuring that your DAX queries explicitly define the user context where necessary to avoid ambiguity.
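As a quick diagnostic for the user-context question, you can evaluate USERPRINCIPALNAME() on its own in a new DAX query and check which identity the engine resolves for the current connection (a sketch of the check, not tied to any specific model):

```dax
// Returns a one-row table containing the UPN the engine resolves
// for the current connection. If this still shows the old test
// alias, the session or cache is retaining that user context.
EVALUATE
{ USERPRINCIPALNAME() }
```

Running this before and after reconnecting (or after clearing the cache) should show whether the stale alias comes from the session or from the query itself.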

Best Regards,

Ada Wang

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 


2 REPLIES 2

lbendlin
Super User

Not clear where you are doing this. In Excel, in Power BI Desktop, or in a dataflow?
