The main fact table in my PBI report has doubled in size (from ~2GB to ~4GB). It's now about 60,000,000 rows. When I try to refresh this table I get the error message:
Failed to save modifications to the server. Error returned: 'There's not enough memory to complete this operation. Please try again later when there may be more memory available.'
It's by far the biggest table in the report, the next biggest table is ~50MB. It's being imported from SQL Server. There was never an issue with loading this table before.
I've tried the following but they didn't make any difference:
- Increasing the "Maximum memory used per simultaneous evaluation (MB)" to 1206MB
- Increasing the "Maximum allowed (MB)" for data cache management to 16GB
- Unticking the auto date/time option
- Changing the query-limit simulation from auto to "Premium capacity" (I have a Pro licence but just wanted to test if this would make a difference).
Besides reducing the table size, is there anything else I could try? My report has lots of measures and calculated columns (mainly DAX rather than Power Query added columns) - do I need to try making them more efficient? I'm sure there are savings to be made there, but I'd rather have a simpler fix in the first instance.
You can easily use Measure Killer to remove all unused columns; this should give you some breathing room.
Hey, thanks for the suggestion. I don't think I need it in this case, but good to know such a tool exists!
Hi @am_nr,
Did this issue get resolved, or do you still need any help? Please do let us know if you still need assistance; we are happy to help.
Thanks,
Prashanth
Please see my reply to your last comment on 18/08.
Hi @am_nr, apologies for missing your last comment.
It's possible the issue is coming from the SQL Server side. When Power BI refreshes, it pushes queries back to the database, so if SQL Server is already under memory pressure that can cause the refresh to fail. I'd recommend testing the same query in SSMS and checking for indexing or resource bottlenecks on the database.
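If you want a quick check of whether the server is under memory pressure while the refresh runs, something along these lines works (it needs VIEW SERVER STATE permission, and nothing in it is specific to your tables):

-- Overall memory state of the SQL Server instance
SELECT
    total_physical_memory_kb / 1024     AS total_physical_mb,
    available_physical_memory_kb / 1024 AS available_physical_mb,
    system_memory_state_desc
FROM sys.dm_os_sys_memory;

-- Running queries that are requesting or holding large memory grants
SELECT
    session_id,
    requested_memory_kb / 1024 AS requested_mb,
    granted_memory_kb / 1024   AS granted_mb,
    wait_time_ms
FROM sys.dm_exec_query_memory_grants
ORDER BY requested_memory_kb DESC;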
thanks,
Prashanth
Hi @v-prasare, thanks for confirming. Ultimately, I think this was the issue, as I was getting the same problem just refreshing with the smaller (2GB) table. Once the logs were cleared on the source database, the problem stopped and the PBI data could be refreshed again. I've accepted your comment as the solution, thank you.
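In case it helps anyone else who lands on this thread, transaction log usage on the source database can be checked with something like the following (run it in SSMS against the source server, assuming you have permission; it is a generic check, not what was run here):

-- Log size and percentage used for every database on the instance
DBCC SQLPERF(LOGSPACE);

-- Or for the current database only, via the DMV
SELECT
    total_log_size_in_bytes / 1048576.0 AS log_size_mb,
    used_log_space_in_bytes / 1048576.0 AS used_log_mb,
    used_log_space_in_percent
FROM sys.dm_db_log_space_usage;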
Hi @am_nr,
We would like to confirm whether our community member's answer resolves your query or if you need further help. If you still have any questions or need more support, please feel free to let us know. We are happy to help you.
Best Regards,
Prashanth Are
MS Fabric community support
Hi @v-prasare, I haven't been able to resolve my memory issue yet but I'm beginning to think it's an issue with the database from which I read in the data, rather than PBI itself. Could this be possible? The database itself is having memory issues, which complicates matters.
This is a very common issue these days. I would try the below:
1. Remove unused columns from the dataset
2. Try to aggregate the table/query so that the row count gets reduced
3. Try to reduce the date range if possible
Hope this helps... I would try to keep the PBIX size <= 1 GB. A rough example of points 1-3 applied at the source query is sketched below.
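All table and column names in this sketch are placeholders, so treat it as a pattern for the source query rather than something to paste in: it keeps only the columns the report uses, pre-aggregates to one row per day and key, and limits the date range pulled into the model.

SELECT
    CAST(s.SaleDateTime AS date) AS SaleDate,
    s.ProductKey,
    s.StoreKey,
    SUM(s.Quantity)  AS Quantity,
    SUM(s.NetAmount) AS NetAmount
FROM dbo.FactSales AS s
WHERE s.SaleDateTime >= DATEADD(year, -2, CAST(GETDATE() AS date))  -- last 2 years only
GROUP BY
    CAST(s.SaleDateTime AS date),
    s.ProductKey,
    s.StoreKey;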
Hi @srlabhe, thanks for your reply. I'd like to avoid aggregating my data before putting it into PBI, but I will try to find some efficiencies elsewhere.
Hi @am_nr
That error isn't fixed by the "maximum MB" options; they don't give VertiPaq more RAM. It means your model got too big to process in memory after the fact table doubled. Shrink the model:
- remove unused columns,
- move calculated columns to Power Query/SQL,
- keep only numeric keys in the fact (move long text to dimensions),
- reduce precision and high-cardinality fields (e.g. round decimals, split DateTime), and
- keep Auto date/time off.
If you must keep all 60M rows, use Incremental Refresh and/or switch the big fact to DirectQuery/aggregations so only recent/summary data is imported. Also note: with a Pro license the published dataset must be ≤1 GB (compressed); a multi-GB model needs PPU/Premium or further reduction. For Desktop refresh, ensure roughly 2–3× the model size is available as free RAM.
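If it helps, here is a rough sketch of how some of that pruning could be pushed into a view on the SQL Server side, so the narrowing happens before the data ever reaches VertiPaq. Every object and column name below is made up, so treat it as a pattern only:

-- Hypothetical view for the fact import; all names are placeholders
CREATE OR ALTER VIEW dbo.vw_FactSales_Import
AS
SELECT
    f.OrderKey,
    f.ProductKey,                                     -- numeric surrogate keys only;
    f.CustomerKey,                                    -- long text stays in the dimensions
    CAST(f.OrderDateTime AS date)    AS OrderDate,    -- split DateTime: date part...
    CAST(f.OrderDateTime AS time(0)) AS OrderTime,    -- ...and time (drop entirely if unused)
    CAST(ROUND(f.NetAmount, 2) AS decimal(18, 2)) AS NetAmount,  -- reduce precision
    f.Quantity
FROM dbo.FactSales AS f
WHERE f.OrderDateTime >= '2020-01-01';                -- optionally trim history

Power Query would then import from the view, and any columns you later find unused can be dropped in the view without touching the model.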
Hi @rohit1991 , thanks for your reply. I will try some of these suggestions! Fingers crossed it's enough.