Hello,
We are currently using Power BI Pro and working on onboarding reports that require semantic models to load 8 million+ records from our data source. Multiple reports will be created on top of these models, and each report applies different filters based on business requirements.
During development, we are running into two major issues:
1. Import Mode – Model Size Limit / Timeouts During Load
We understand that Power BI Pro supports 10 GB shared capacity, but each individual dataset is limited to 1 GB (compressed).
While loading our model (already close to this limit), we encounter:
Timeout errors during data load
Very high memory consumption when applying certain filters (for example, a single field filter uses ~500 MB)
We considered creating multiple workspaces, each hosting a separate semantic model, thinking this might help utilize the shared capacity. However, we are unsure whether this approach actually increases dataset capacity, or if the 1 GB dataset limit applies regardless of the workspace.
Question:
Is it possible to increase dataset/model size under Power BI Pro, or is the 1 GB limit fixed per dataset across all workspaces? Would using multiple workspaces help in handling larger semantic models?
2. DirectQuery – Query Timeouts
We also attempted DirectQuery, but it resulted in:
Query timeout errors after running for a couple of hours
We would appreciate recommendations on:
Whether using multiple workspaces to host separate models is a viable way to increase the available memory
How to reliably load and query datasets of 8M+ rows under Power BI Pro
Whether we should consider:
Incremental refresh
Keeping in mind that incremental refresh also requires the full dataset to load successfully at least once, which we are currently unable to achieve due to timeouts
Upgrading to PPU or Premium Capacity
Best practices to avoid load failures and DirectQuery timeout issues at this data volume
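For context on the incremental refresh option above: in Power BI it works by filtering the source on two reserved Power Query datetime parameters, RangeStart and RangeEnd, so each refresh loads only one partition instead of the full history. The following is a conceptual Python sketch of that filter pattern, not the actual Power Query implementation; the sample rows and column layout are made up for illustration.

```python
from datetime import datetime

# Hypothetical sample of transactional rows: (transaction_id, posted_at)
rows = [
    (1, datetime(2023, 5, 1)),
    (2, datetime(2024, 1, 15)),
    (3, datetime(2024, 2, 1)),
]

def partition(rows, range_start, range_end):
    """Mimic Power BI's RangeStart/RangeEnd filter: keep rows where
    range_start <= posted_at < range_end. The half-open interval
    prevents rows on a boundary from landing in two partitions."""
    return [r for r in rows if range_start <= r[1] < range_end]

# A refresh of the January 2024 partition touches one month of data,
# not the whole 8M+ row history.
jan = partition(rows, datetime(2024, 1, 1), datetime(2024, 2, 1))
print(jan)
```

The half-open boundary convention shown here matches how Power BI documents the RangeStart/RangeEnd filter, which is why exactly one of the two comparisons must be strict.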
We would appreciate timely guidance on this, as it is a key priority.
Hi @Suriya_1,
We’d like to follow up regarding the recent concern. Kindly confirm whether the issue has been resolved, or if further assistance is still required. We are available to support you and are committed to helping you reach a resolution.
Best Regards,
Chaithra E.
Hi @Suriya_1 ,
May I ask if you have resolved this issue? Please let us know if you have any further issues, we are happy to help.
Thank you.
Hey @Suriya_1 ,
Can you please tell us more about the column that is taking approx. 500 MB?
Is this a key column? Meaning, do you need this one column for specific operations on the model?
There are a few existing techniques to reduce the model size, like disabling the "Available In MDX" option (reducing the dictionary size; available via Tabular Editor). Be a bit careful with that when using the data in Excel.
It seems you have many columns inside: most likely some of them are really heavyweight.
8M rows is not that much, even for a Pro license.
Try to reduce the model size with known techniques.
Take a look at this:
https://data-mozart.com/how-to-reduce-your-power-bi-model-size-by-90/
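To make "heavyweight" concrete before touching the model, you could profile the raw extract first: VertiPaq builds one dictionary entry per distinct value in a column, so high-cardinality columns (like a unique transaction number) dominate memory. A minimal pure-Python sketch of that kind of profiling; the column names and sample data are made up:

```python
import sys

def profile(rows, columns):
    """Rough per-column profile: cardinality plus an approximate
    dictionary size (sum of the Python object sizes of the distinct
    values). High-cardinality columns compress worst in VertiPaq."""
    stats = {}
    for i, col in enumerate(columns):
        distinct = {row[i] for row in rows}
        dict_bytes = sum(sys.getsizeof(v) for v in distinct)
        stats[col] = (len(distinct), dict_bytes)
    return stats

# Hypothetical sample: a unique transaction number vs. a 2-value status
rows = [(f"TX{n:07d}", "OPEN" if n % 2 else "CLOSED") for n in range(1000)]
for col, (card, size) in profile(rows, ["txn_no", "status"]).items():
    print(f"{col}: {card} distinct values, ~{size} bytes of dictionary")
```

The absolute byte counts are only indicative (VertiPaq encodes data very differently from Python objects), but the relative ranking tells you which columns to drop, truncate, or split before import.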
Regards
Thanks @sergej_og !
We are trying to follow best practices to reduce the size. Yes, it is a key column: a transaction number.
Hi @Suriya_1 ,
Thank you for the update. The 1 GB compressed dataset limit in Power BI Pro is a hard limit and cannot be exceeded. Creating or distributing models across multiple workspaces does not increase the maximum size available to a single dataset. Excessive memory consumption from key or high-cardinality columns is primarily a data model design challenge, not something licensing alone can resolve.
For large, transactional, and memory-intensive models, the supported and scalable solution is Premium Capacity combined with proper model optimisation.
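One widely documented VertiPaq optimisation for exactly this design problem is splitting a high-cardinality numeric key into two lower-cardinality columns, so the engine stores two small dictionaries instead of one huge one. A minimal sketch of the idea; the split base and ID range are illustrative, and the recombination would normally be done in the source query or DAX:

```python
def split_id(txn_no, base=10_000):
    """Split a wide numeric ID into a 'high' and a 'low' part.
    Each part has far fewer distinct values than the original,
    so the per-column dictionaries shrink dramatically.
    Recombine when needed as high * base + low."""
    return txn_no // base, txn_no % base

ids = range(10_000_000, 10_050_000)  # 50k hypothetical transaction numbers
high = {split_id(i)[0] for i in ids}
low = {split_id(i)[1] for i in ids}
print(len(high), len(low))  # two small dictionaries vs. 50,000 entries
```

The trade-off is that the original key is no longer directly filterable as one column, so this suits keys used for counting or drill-through rather than for relationships.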
Best Regards,
Chaithra E.
Hi @Suriya_1,
What is Power BI Premium? - Microsoft Fabric | Microsoft Learn
DirectQuery in Power BI: When to Use, Limitations, Alternatives - Power BI | Microsoft Learn
If this response was helpful in any way, I’d gladly accept a 👍, much like the joy of seeing a DAX measure work first time without needing another FILTER.
Please mark it as the correct solution. It helps other community members find their way faster (and saves them from another endless loop 🌀).
Hi @Zanqueta
Thank you for your quick response.
Quick question regarding Option B of the recommended approach (Premium or PPU).
You mentioned that upgrading to Power BI Premium Per User (PPU) or Premium Capacity increases dataset size limits:
PPU: Up to 100 GB per dataset
Premium Capacity: Up to 400 GB per dataset
These options also provide larger memory allocations, longer query timeouts, and advanced features such as aggregations and hybrid tables.
Our question is about the practical implications when choosing between PPU (100 GB limit) and Premium Capacity (400 GB limit). Since our semantic model pulls data from transactional tables, we noticed that even applying a single filter on one field consumes around 440 MB of memory. This raises the concern that even with a 100 GB dataset limit, the memory usage could still grow significantly depending on cardinality and filter operations.
Given this, would PPU (100 GB per dataset) be sufficient, or would Premium Capacity (400 GB per dataset) be more appropriate for our scenario?
Additionally, could you please assist with a cost comparison between PPU and Premium Capacity to help us evaluate Option B?
Hi @Suriya_1,
Refer to this official Microsoft documentation for all the details you need. The 400 GB limit for Premium generally applies only to the highest capacity SKUs.
Give a Thumbs Up if this post helped you in any way and Mark This Post as Solution if it solved your query !!! Proud To Be a Super User !!!
To my understanding, PPU costs $24 per user per month, supports up to 100 GB per dataset, and offers storage that starts around 10 TB and can grow depending on the tenant, with 100 TB typically being the upper limit.
Premium Per Capacity (PPC) is priced based on the capacity SKU you purchase, and it supports up to 400 GB per dataset (on the highest SKUs) along with 100 TB or more of storage.
Correct me if I'm wrong.
@Suriya_1,
Seems right. You can use this Microsoft licensing guide for better clarity.
Couple of pointers:
1. All licenses (Pro, PPU, and Premium Capacity) can be cheaper if your client/organization has already purchased licenses in bulk. Microsoft offers discounted pricing on every license when it is purchased in bulk; for example, the Pro license, which lists at $14, sometimes goes as low as ~$7.60. So ask your Power BI platform admin team about this as well.
2. The type of license you choose also depends a lot on how many users will be viewing the dashboards. Even if you have a PPU license, free-licensed users generally cannot view reports in a non-Premium workspace. If the user base is large, I'd recommend Premium capacity, since you only need to pay for the capacity plus a few Pro licenses for developers, while free-licensed users can still view the reports.
IMO, PPU is best when you have cost constraints and only need certain Premium capabilities for the PPU-licensed users. Premium capacity is the go-to for getting all Premium features when the user base is large.
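The user-count trade-off above can be reduced to simple breakeven arithmetic. The sketch below uses the $24/user PPU price quoted earlier in the thread; the capacity figure of roughly $5,000/month for a reserved P1/F64-class SKU is an assumption you should verify against current Microsoft pricing, and it ignores the extra Pro licenses for developers:

```python
# Assumed list prices (verify against current Microsoft pricing)
PPU_PER_USER = 24          # $/user/month, stated earlier in the thread
CAPACITY_PER_MONTH = 5_000  # $/month, rough reserved P1/F64-class figure

def cheaper_option(viewer_count):
    """Return which licensing route costs less for a given number of
    report viewers, at the assumed prices above."""
    ppu_total = viewer_count * PPU_PER_USER
    return "PPU" if ppu_total < CAPACITY_PER_MONTH else "capacity"

# At these assumed prices the crossover sits near 5000 / 24 ≈ 208 users.
print(cheaper_option(50))
print(cheaper_option(300))
```

In other words, well under ~200 viewers PPU tends to win on cost; well over it, a dedicated capacity does, before even counting the free-viewer benefit described above.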
Give a Thumbs Up if this post helped you in any way and Mark This Post as Solution if it solved your query !!! Proud To Be a Super User !!!