#QuestionForGroup
Hi all,
I have a dataset of 2.9M rows which, after unpivoting and merging with one table, grows to some 246M rows. After that I have to remove duplicates from the table (based on Col1, Col2, Col3 … Col7). When I then Close & Apply, the dataset loads for 3–4 hours and then fails with a "buffer memory out of space" error.
Could anyone please help me out with this?
Possible solutions tried so far:
1. Took a dataset of 75,000 rows, performed all the steps, and verified everything worked. After getting satisfactory results, I published the dashboard to the Power BI Service, renamed the 2.9M-row file to match the 75,000-row file name, and refreshed it in the Service. Result: this didn't work, as it would not take the 2.9M rows.
2. Tried joining the tables instead, but that didn't work either (many-to-one join, but the corresponding values are unique), so the row counts don't match.
Note: ALL UNNECESSARY COLUMNS ARE REMOVED
Hi @Ayushman1997 ,
Here’s a breakdown of what usually works best in real-world scenarios:
1. Always filter and trim before unpivoting or merging. Don't wait until after the unpivot to remove unnecessary rows and columns; do it as the very first step. If you can, filter at the SQL/source level before Power BI even touches the data.
2. Unpivot only the columns you actually need. The fewer columns you unpivot, the less Power Query has to process and the less memory it uses.
3. When merging, buffer the smaller table. Use Table.Buffer() on the smaller table before the merge. Don't buffer massive tables; that can actually make things worse.
4. Remove duplicates as early as possible. If you need to deduplicate, do it before merging. If your source supports it (SQL, Python, etc.), do the deduplication outside of Power BI.
5. Use DirectQuery or dataflows for really large data. If your source is a SQL database, see whether DirectQuery is an option (it avoids loading everything into memory). Or use a Power BI dataflow if your org allows it; it's more scalable than desktop Power Query for big crunches.
6. Disable load for intermediate queries in Power Query. Right-click queries that are just intermediate steps and untick "Enable Load"; this can save a ton of RAM.
7. Try splitting your transformation into stages. If possible, pre-process your data into smaller, "staged" files or tables. Sometimes breaking one huge import into smaller parts solves the problem.
8. Consider incremental refresh or partitioning (Premium/Fabric only). If your report will keep growing, look at incremental refresh to load data in chunks.
Example M code (pattern):
let
    // Source stands in for your own connection step
    Source = YourSource,
    // Filter at the very start
    Filtered = Table.SelectRows(Source, each [Col1] <> null and [Col2] <> null),
    // Unpivot only the needed columns
    Unpivoted = Table.UnpivotOtherColumns(Filtered, {"Col1", "Col2"}, "Attribute", "Value"),
    // Buffer only the small table before the merge
    SmallTableBuffered = Table.Buffer(OtherSmallTable),
    // Merge
    Merged = Table.NestedJoin(Unpivoted, "Key", SmallTableBuffered, "Key", "NewData", JoinKind.LeftOuter),
    // Expand, then deduplicate on the key columns (pass the columns you dedupe on)
    Expanded = Table.ExpandTableColumn(Merged, "NewData", {"NewCol"}),
    Deduped = Table.Distinct(Expanded, {"Col1", "Col2"})
in
    Deduped
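To illustrate step 4 (dedupe before merge) against your scenario specifically, here is a minimal sketch assuming the seven key columns are literally named Col1–Col7 as in your question; deduplicating the unpivoted table first means the join then runs against far fewer rows:

```m
// Col1..Col7 are the key columns from the question; adjust to your real names
DedupedEarly = Table.Distinct(Unpivoted, {"Col1", "Col2", "Col3", "Col4", "Col5", "Col6", "Col7"}),
// Merge afterwards, so the expensive join sees the already-reduced table
MergedAfterDedup = Table.NestedJoin(DedupedEarly, "Key", SmallTableBuffered, "Key", "NewData", JoinKind.LeftOuter)
```

If the 2.9M → 246M explosion happens at the merge, moving Table.Distinct ahead of it is usually the single biggest memory win.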
Bonus tips: close all other heavy apps (Chrome, Teams, etc.) while you're applying steps. If you have access to a machine with more RAM, try running Power BI Desktop there. Sometimes it's honestly faster to do some "pre-crunching" in Python, R, or even Excel and import the cleaned file into Power BI.
I have checked with an ETL approach: the resulting output takes up 70 GB, which exported as CSV comes to 22 GB. Loading that 22 GB file directly takes 3 hours.
Hi there!
- Instead of unpivoting first, try filtering out unnecessary columns and aggregating before the unpivot; you can use Remove Columns in Power Query for this.
- You could also try incremental refresh. Enable it in the Power BI Service and filter the data dynamically in Power Query, for example: Table.SelectRows(Source, each [Date] >= Date.AddMonths(Date.From(DateTime.LocalNow()), -12)). Don't forget to set up the partitioning so that Power BI loads data in chunks instead of one large load.
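For completeness, the filter shape that Power BI's incremental refresh actually binds to uses the reserved RangeStart/RangeEnd datetime parameters; a minimal sketch, assuming a hypothetical [Date] datetime column:

```m
// RangeStart and RangeEnd are the reserved parameters that incremental
// refresh substitutes at refresh time to load each partition
Filtered = Table.SelectRows(Source, each [Date] >= RangeStart and [Date] < RangeEnd)
```

Note the >= on one end and < on the other, so rows on a partition boundary are never loaded twice.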
Hope this helps!
😁😁
All unnecessary columns are deleted before unpivoting. The data covers one year and genuinely contains 2.9M rows at minimum 😅. Also, incremental or not, they want the full year of data shown, so I don't see the point of adding incremental refresh; a scheduled refresh should be fine.
Hi @Ayushman1997,
Thank you for reaching out to the Microsoft Fabric Community Forum.
After thoroughly reviewing the details you provided, here are a few workarounds that may help resolve the issue. Please follow these steps:
If this post helps, then please give us ‘Kudos’ and consider Accept it as a solution to help the other members find it more quickly.
Best Regards.
Hi @Ayushman1997,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will be helpful for other community members who have similar problems to solve it faster.
Thank you.
Hi @Ayushman1997,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
The issue is not solved yet. It's not working even after applying the suggestions from the comments.
Hi @Ayushman1997,
After thoroughly reviewing the details you provided, I have identified alternative workarounds that may help resolve the issue. Please follow these steps:
If this post helps, then please give us ‘Kudos’ and consider Accept it as a solution to help the other members find it more quickly.
Best Regards.
Hey, hi. No, the solution is not working. No other solution has worked so far, and I have escalated it to the Power BI/Microsoft teams. Once I get a solution, I will post it. As of now, the memory issue is due to the high number of records.
Hi @Ayushman1997,
Thank you for sharing your update. We understand that you are encountering a memory issue due to the large number of records in Power BI. Since you have already escalated this to the relevant teams, we appreciate your patience as they investigate further.
In the meantime, you might consider optimizing your dataset by:
Also, please go through the below solved solution for better understanding:
Solved: Memory run out issues in power bi desktop - Microsoft Fabric Community
If this post helps, then please give us ‘Kudos’ and consider Accept it as a solution to help the other members find it more quickly.
Best Regards
Hi @Ayushman1997,
I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.
Thank you.