I encountered this RAM issue when loading my dashboard:
Error fetching data for this visual
Resource Governance: We cannot complete the requested operation because there isn't enough memory (consumed memory 8548 MB, memory limit 3072 MB).
Please check the technical details for more information. If you contact support, please provide these details.
Is there any way to monitor how much RAM is used by the semantic model and other running tasks? I'm thinking about splitting the pages of this one report file into multiple report files. Would that help solve the problem? And is there any other way to deal with this RAM issue?
Hi @minhnhatdanchoi ,
If I were you, I would do the following.
You basically have two options: either increase the Fabric capacity or optimize your data model.
Without knowing your data model, it is of course difficult to make recommendations.
But this is what I would do, or rather, how I would proceed.
- Check with the Metrics app, as already mentioned
- Use DAX Studio to identify large tables and columns. You probably already know most of this.
- Create aggregations. This should allow you to achieve the greatest memory reduction (see the sketch after this list).
- Remove history from the model.
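As a rough illustration of the last two points, here is a minimal PySpark sketch you could run in a Fabric notebook attached to the lakehouse. The table and column names (fact_sales, OrderDate, CustomerKey, Amount) are hypothetical placeholders, and `spark` is the session the notebook provides:

```python
# Minimal sketch of "create aggregations" and "remove history" on the lakehouse side.
# Table/column names are hypothetical; adapt to your own model.
from pyspark.sql import functions as F

fact = spark.read.table("fact_sales")

# Aggregation: pre-summarize to the grain the visuals actually need, so the
# semantic model can be pointed at a much smaller table.
daily_agg = (
    fact.groupBy(F.to_date("OrderDate").alias("OrderDate"), "CustomerKey")
        .agg(F.sum("Amount").alias("TotalAmount"),
             F.count("*").alias("RowCount"))
)
daily_agg.write.mode("overwrite").saveAsTable("fact_sales_daily_agg")

# Remove history: keep only the window the reports actually filter on (e.g. 90 days).
recent = fact.filter(F.col("OrderDate") >= F.date_sub(F.current_date(), 90))
recent.write.mode("overwrite").saveAsTable("fact_sales_recent")
```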
So my recommendation is that you should revise your data model. You should also consider whether Direct Lake is the right choice, as unfortunately it is not always suitable.
In the long term, you will probably need to increase your Fabric capacity.
You could also use the Microsoft Fabric Capacity Estimator. It's still in preview, but it gives some good pointers.
https://www.microsoft.com/en-us/microsoft-fabric/capacity-estimator
I hope I was able to help you a little.
Best regards
Hi @minhnhatdanchoi,
Can you provide some more details? How big is your dataset? Are we talking millions of rows or hundreds of millions?
What size capacity are you using?
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Environment & scale
Lakehouse: ~20 tables total.
Fact tables: 5 fact tables that receive new data daily.
Ingest rate: ~100,000 new rows per day (total across facts).
Historical size: tables contain data from July → each table ≈ 50 million rows.
Semantic models: using Direct Lake mode.
Dashboards: slicer/filter targets the most recent 1 month of data.
Capacity: currently on F4.
Hi @minhnhatdanchoi,
Do your visuals run on all the data in these tables? Are you able to limit what you bring into the report to just the recent records?
If you limit the data at ingest time rather than using a filter, the VertiPaq engine won't need to process 50 million records on every interaction.
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
I need to know how to limit the data at ingest time rather than using a filter. I'm kind of new to Power BI 😅
Hi @minhnhatdanchoi,
What storage mode are you using? Import, Direct Lake, or DirectQuery?
If you're using Import mode, then you can filter the data in Power Query in the "Transform Data" window.
If you're using Direct Lake or DirectQuery, it gets more complicated.
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Yeah, I'm using Direct Lake. I just received word that users also want to filter on date ranges outside the last 30 days, so this step isn't necessary now, but I'd still like to know how to do it in Direct Lake mode.
Hi @minhnhatdanchoi,
With Direct Lake mode, you will need to ensure your tables only have 30 days of data in them. In the ETL that loads data into your lakehouse/warehouse, add a step that takes the last 30 days of data and inserts it into a new table, and then source your report on that new table.
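As a rough sketch of that ETL step (run as the last step of the daily load in a Fabric notebook; the names fact_sales, OrderDate, and fact_sales_last30 are hypothetical):

```python
# Rebuild a rolling 30-day table for the Direct Lake model to read.
# Table/column names are hypothetical placeholders.
from pyspark.sql import functions as F

fact = spark.read.table("fact_sales")

last_30_days = fact.filter(F.col("OrderDate") >= F.date_sub(F.current_date(), 30))

# Overwrite each day so the reporting table never grows past ~30 days of rows.
last_30_days.write.mode("overwrite").saveAsTable("fact_sales_last30")
```

You would then point the Direct Lake semantic model at fact_sales_last30 instead of the full fact table.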
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Your semantic model and visual queries are trying to use ~8.5 GB RAM, but F4 capacity allows ~3 GB per dataset/operation.
This typically happens with Direct Lake mode, especially when:
- The model is not optimized (wide tables, high-cardinality columns; see the sketch after this list).
- Visuals trigger complex or large DAX queries.
- Multiple visuals load at once with large filter contexts.
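If you want a quick way to spot high-cardinality columns, one rough approach (from a Fabric notebook, with a hypothetical table name) is to check the distinct counts directly on the lakehouse table:

```python
# Approximate distinct count per column of a lakehouse table ('fact_sales' is hypothetical).
# Columns with millions of distinct values (GUIDs, timestamps, free text) compress
# poorly in VertiPaq and inflate memory usage.
from pyspark.sql import functions as F

df = spark.read.table("fact_sales")

cardinality = df.agg(*[F.approx_count_distinct(c).alias(c) for c in df.columns])
cardinality.show(truncate=False)
```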
Hi @minhnhatdanchoi,
Monitor memory usage using the Capacity Metrics App. If your semantic model needs more RAM, you may need to upgrade your capacity. To enhance performance, try reducing the dataset size and use model optimization methods like aggregations or DirectQuery for large tables. Power BI does not show memory usage for individual visuals or measures.
Thank you.
My semantic model already uses Direct Lake; my table contains 5 million rows.
Hi @minhnhatdanchoi,
Key Best Practice
- Consider incremental refresh with Direct Lake to keep only recent partitions active.
I'm curious about this. How can I set this up for my model?
Hi @minhnhatdanchoi ,
Consider the GYC video as a starter: Reframe your Power BI Direct Lake Semantic Model
and check this post as well:
Solved: Re: How to do incremental refresh using datalake o... - Microsoft Fabric Community
If this response was helpful in any way, I’d gladly accept a 👍, much like the joy of seeing a DAX measure work the first time without needing another FILTER.
Please mark it as the correct solution. It helps other community members find their way faster (and saves them from another endless loop 🌀).
Hi @minhnhatdanchoi,
here is the documentation about Direct Lake.
Read this to understand how Direct Lake works:
https://learn.microsoft.com/en-us/fabric/fundamentals/direct-lake-overview
Classic incremental refresh, as available in Import mode, is not available in Direct Lake.
The documentation explains how it works differently there: you keep the Delta tables in the lakehouse up to date, and the semantic model reframes to pick up the new data.
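As a rough sketch of that lakehouse-side pattern (assuming Delta tables; the names staging_sales, fact_sales, and SaleId are hypothetical):

```python
# Merge only the new rows into the Delta fact table; the Direct Lake model then
# picks up the new Delta version when it reframes. Names are hypothetical.
from delta.tables import DeltaTable

new_rows = spark.read.table("staging_sales")   # e.g. today's ~100k incoming rows

target = DeltaTable.forName(spark, "fact_sales")

(target.alias("t")
    .merge(new_rows.alias("s"), "t.SaleId = s.SaleId")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```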
Best regards