Join us at FabCon Atlanta from March 16 - 20, 2026, for the ultimate Fabric, Power BI, AI and SQL community-led event. Save $200 with code FABCOMM. Register now.

jaryszek
Memorable Member

Error fetching data

Hello,

I got the following error:

[screenshot: error message]

The report was working yesterday; today I cannot open anything because the visuals do not load.
What can I do?

The dataset is a OneLake dataset.

Best,
Jacek

1 ACCEPTED SOLUTION
v-achippa
Community Support

Hi @jaryszek,

 

Thank you for reaching out to Microsoft Fabric Community.

 

Your dataset query is failing because it needs more memory (38 GB) than the current workspace capacity allows (25 GB limit).

  • Please move the workspace to a Fabric or Premium capacity with higher memory limits, or scale up your existing capacity.
  • Otherwise, you will need to reduce the dataset size, for example by removing unused columns or enabling incremental refresh.

 

Thanks and regards,

Anjan Kumar Chippa

View solution in original post

4 REPLIES
jaryszek
Memorable Member

Thank you, but I need to find what is causing it.

Can somebody advise how to check the size, and which query or measure is causing the issue?

I tried Vertipaq Analyzer from SQLBI, but there is no size reported... this is a OneLake model.
https://community.fabric.microsoft.com/t5/Data-Warehouse/Determine-size-of-semantic-model/m-p/428104... 

Has anybody managed to find the size of a semantic model on OneLake?

I removed a measure I had added, which was:

% of Total Quantity AllSelected =
VAR Denom =
    CALCULATE ( [Total Quantity], ALLSELECTED ( Fct_EA_AmortizedCosts ) )
RETURN
    DIVIDE ( [Total Quantity], Denom, 0 )


Nothing difficult...
but it caused my report to fail...
I need to:

1. Check the semantic model size after loading from OneLake and after running queries.
2. Identify what is so slow that it caused the report to fail.
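On point 1, one possible way to inspect column storage for the model is the DAX INFO functions; this is a sketch, assuming a recent Power BI Desktop with the DAX query view (or DAX Studio connected to the semantic model), where these functions are available:

```dax
// Sketch: list storage metadata for every column in the connected model.
// INFO.STORAGETABLECOLUMNS() surfaces the same data as the
// DISCOVER_STORAGE_TABLE_COLUMNS DMV; availability depends on your
// Power BI / engine version. Note that for a Direct Lake model, sizes
// reflect only the columns currently paged into memory, so run the
// problem queries first and then inspect the output.
EVALUATE
    INFO.STORAGETABLECOLUMNS ()
```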

Best,
Jacek

Hi @jaryszek,

 

Since this is a OneLake (Direct Lake) semantic model, you can’t see table sizes in Vertipaq Analyzer the way you can for Import models.

  • To check model size, look at the underlying lakehouse or warehouse tables; their row counts and file sizes reflect the semantic model size.
  • To find which query or measure is causing issues, use Performance Analyzer in Power BI Desktop to see which visual or query runs long, and check the Fabric Capacity Metrics app for any memory spikes.
  • Your % of Total Quantity measure with ALLSELECTED can scan the entire fact table, which can cause the memory spike. Try testing with filters, or try replacing ALLSELECTED with a lighter alternative.
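As a sketch of that last suggestion: narrowing ALLSELECTED from the whole fact table to a single column avoids materializing the expanded fact table. The column name below is a hypothetical placeholder for whatever column your visuals actually group or slice on:

```dax
-- Hypothetical rewrite of the measure above; Fct_EA_AmortizedCosts[GroupColumn]
-- is a placeholder for the column the visual groups or slices on.
% of Total Quantity AllSelected (lighter) =
VAR Denom =
    CALCULATE (
        [Total Quantity],
        ALLSELECTED ( Fct_EA_AmortizedCosts[GroupColumn] )
    )
RETURN
    DIVIDE ( [Total Quantity], Denom, 0 )
```

Whether this returns the same totals depends on how the report is filtered, so compare results against the original on a small, filtered page first.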

 

Thanks and regards,

Anjan Kumar Chippa

Hi @jaryszek,

 

As we haven’t heard back from you, we wanted to kindly follow up to check whether the solution provided worked for you. Please let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

