I'm trying to import a fact table with ~45 million rows, and the import chokes at around 5 GB (45 minutes in).
I've already trimmed down columns, avoided text strings, aggregated, etc. to help the load. I know Power BI is supposed to handle billions of records, but I haven't been able to get anywhere close to that amount. We are trying to avoid going down the tabular model route if possible, but this limitation is causing us to rethink our architecture.
Any ideas on the disconnect?
Hi Nathan,
I wonder if you could benefit from using DirectQuery instead of loading all of your data. Note that you can design your model as a composite (Dual-mode) model: some tables in-memory (Import, loading all data) and others using DirectQuery.
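If you do stay on Premium with Import mode, incremental refresh is also worth a look: instead of reloading all 45M rows, each refresh only imports a date-bounded slice. A minimal Power Query M sketch of the standard filter pattern, assuming a date column named `OrderDate` and hypothetical server/database/table names (the `RangeStart`/`RangeEnd` datetime parameters are the ones Power BI's incremental refresh policy binds to):

```m
let
    // Hypothetical source; replace with your own server and database
    Source = Sql.Database("myserver", "mydb"),
    Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Filter on RangeStart/RangeEnd so each partition imports only its slice.
    // Use >= on one bound and < on the other to avoid duplicating boundary rows.
    Filtered = Table.SelectRows(Fact, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered
```

With a filter like this in place, you define the incremental refresh policy on the table in Power BI Desktop and the service partitions the data for you, so scheduled refreshes stay well under the full 5 GB load.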
Please read the following regarding dataset sizes:
https://docs.microsoft.com/en-us/power-bi/service-premium-what-is
Depending on the SKU, Power BI Premium supports uploading Power BI Desktop (.pbix) model files up to a maximum of 10 GB in size. When loaded, the model can then be published to a workspace assigned to a Premium capacity. The dataset can then be refreshed to up to 12 GB in size.
Large datasets can be resource-intensive. You should have at least a P1 SKU for any datasets larger than 1 GB. Although publishing large datasets to workspaces backed by A SKUs up to A3 could work, refreshing them will not.
The following table shows the recommended SKUs for .pbix file upload or publish to the Power BI service:
| SKU | Max .pbix size |
|---|---|
| P1 | < 3 GB |
| P2 | < 6 GB |
| P3, P4, P5 | up to 10 GB |
Let me know if this helps.
Tomas