I am facing a memory allocation issue in Power BI. I am using Premium Per User and importing around 26 million rows; the file is under 500 MB, yet I still get the error.
If anyone can help me with this issue, please do.
Hi @RehanAmjad_ ,
Normally, a refresh of that size should succeed. Could you provide the detailed error message? That would make it easier to troubleshoot the problem.
Best regards,
Community Support Team_ Scott Chang
Even if your dataset is under 500 MB, handling 26 million rows can be quite intensive. So the first question to ask: do you really need all that data? Why not aggregate it?
You can also reduce the number of columns, filter out unnecessary rows, and use aggregations or summaries where possible. Also verify the data types, since inappropriate data types can consume more memory.
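As a rough sketch of that advice in Power Query M (the server, database, table, and column names here are hypothetical placeholders, not from your model):

```m
// Minimal sketch: keep only needed columns and tighten data types.
let
    Source = Sql.Database("MyServer", "MyDb"),                      // assumed source
    Sales  = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Keep only the columns the report actually uses
    Slim   = Table.SelectColumns(Sales, {"OrderDate", "ProductKey", "Amount"}),
    // Fixed decimal (Currency) typically compresses better than floating point
    Typed  = Table.TransformColumnTypes(Slim, {{"Amount", Currency.Type}})
in
    Typed
```

Dropping high-cardinality columns (GUIDs, timestamps with seconds, free text) usually gives the biggest memory savings, because they compress poorly in the VertiPaq engine.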
Two things come to mind: split your data into smaller chunks, or use incremental refresh.
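For the incremental refresh route, the query must filter on the reserved `RangeStart` and `RangeEnd` datetime parameters so Power BI can partition the table; a minimal sketch (source and column names are assumptions, adjust to your model):

```m
// RangeStart and RangeEnd must be DateTime parameters with exactly these names.
let
    Source   = Sql.Database("MyServer", "MyDb"),                    // assumed source
    Sales    = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // This filter should fold to the source, so each refresh only
    // loads the rows for the partition being processed.
    Filtered = Table.SelectRows(Sales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered
```

After publishing, you configure the incremental refresh policy (archive window and refresh window) on the table in the Power BI service; the initial refresh loads the history once, and subsequent refreshes only touch recent partitions.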