Hi,
We are creating a report from a table that contains 70M records; our report query would pull around 60M rows from this table.
Our database source is Postgres. Power BI is hosted on a Windows virtual machine with 10 GB of RAM and 2 CPUs.
The issue: when we try pulling the data from the database view, which contains around 60M rows, we get a memory allocation error.
I have tried reducing the dataset to 25M rows, with the same memory allocation error. However, when I load the same data from a CSV file, it imports into Power BI without issue.
Can anyone please suggest how I can improve Power BI's performance so that I can pull data directly from the database without using a file as a source?
Are you using the 32-bit version of Power BI Desktop? That would cause this, and there is no fix short of moving to the 64-bit version. The 32-bit app is limited to 2 GB of RAM, and 70 million records would probably break that barrier, depending on the number of fields, other tables, etc.
If you are using the 64-bit version, I've seen memory errors and crashes when the data contains non-printing ASCII characters, which you can strip out using a CLEAN transformation in Power Query before the data is imported into the data model.
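For reference, a minimal Power Query M sketch of that CLEAN step, applied to a Postgres view. The server, database, view, and column names here are placeholders; swap in your own, and list whichever text columns you want scrubbed:

```m
let
    // Placeholder connection details - replace with your own server/database
    Source = PostgreSQL.Database("your-server", "your-database"),
    // Placeholder schema/view name for the 60M-row view
    SourceView = Source{[Schema = "public", Item = "your_view"]}[Data],
    // Text.Clean is the CLEAN transformation: it removes non-printing
    // control characters from text values before the data model load.
    // Replace "customer_name" and "notes" with your actual text columns.
    Cleaned = Table.TransformColumns(
        SourceView,
        {
            {"customer_name", Text.Clean, type text},
            {"notes", Text.Clean, type text}
        }
    )
in
    Cleaned
```

Applying the transform in Power Query (rather than after load) lets the step fold into the refresh pipeline, so the dirty characters never reach the in-memory model.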
DAX is for Analysis. Power Query is for Data Modeling
Proud to be a Super User!
MCSA: BI Reporting