Suriya_1
New Member

Handling 8M+ Rows in Power BI Pro – Import Model Size Limitations and DirectQuery Timeout Issues

Hello,

We are currently using Power BI Pro and working on onboarding reports that require semantic models to load 8 million+ records from our data source. Multiple reports will be created on top of these models, and each report applies different filters based on business requirements.

During development, we are running into two major issues:
1. Import Mode – Model Size Limit / Timeouts During Load

We understand that Power BI Pro supports 10 GB shared capacity, but each individual dataset is limited to 1 GB (compressed).
While loading our model (already close to this limit), we encounter:

  • Timeout errors during data load

  • Very high memory consumption when applying certain filters (for example, a single field filter uses ~500 MB)

We considered creating multiple workspaces, each hosting a separate semantic model, thinking this might help utilize the shared capacity. However, we are unsure whether this approach actually increases dataset capacity, or if the 1 GB dataset limit applies regardless of the workspace.

Question:
Is it possible to increase dataset/model size under Power BI Pro, or is the 1 GB limit fixed per dataset across all workspaces? Would using multiple workspaces help in handling larger semantic models?

2. DirectQuery – Timeout Issues When Querying the Source

We attempted using DirectQuery, but it also resulted in:

  • Query timeout errors after running for a couple of hours

Looking for Guidance

We would appreciate recommendations on:

  • Whether hosting separate semantic models in multiple workspaces is a viable way to increase the available memory/model size

  • How to reliably load and query datasets of 8M+ rows under Power BI Pro

  • Whether we should consider:

    • Incremental refresh
      Keeping in mind that incremental refresh also requires the full dataset to load successfully at least once, which we are currently unable to achieve due to timeouts

    • Upgrading to PPU or Premium

  • Best practices to avoid load failures and DirectQuery timeout issues at this data volume

We would appreciate timely guidance on this, as it is a key priority.

10 REPLIES
v-echaithra
Community Support

Hi @Suriya_1 ,

We’d like to follow up regarding the recent concern. Kindly confirm whether the issue has been resolved, or if further assistance is still required. We are available to support you and are committed to helping you reach a resolution.

Best Regards,
Chaithra E.

v-echaithra
Community Support

Hi @Suriya_1 ,

May I ask if you have resolved this issue? Please let us know if you have any further issues, we are happy to help.

Thank you.

sergej_og
Super User

Hey @Suriya_1 ,
can you please tell us more about the column that is taking approx. 500 MB?
Is it a key column? That is, do you need that one column for specific operations on the model?
There are a few existing techniques to reduce the model size, such as disabling the "Available In MDX" option (which reduces the dictionary size; available via Tabular Editor). Be a bit careful with that when using the data in Excel.

It seems you have many columns in the model - most likely some of them are really heavyweight.
8M rows is not that much, even on a Pro license.

Try to reduce the model size with the known techniques.
Take a look at this:
https://data-mozart.com/how-to-reduce-your-power-bi-model-size-by-90/
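To give a feel for why a unique key column is so expensive, here is a rough, scaled-down sketch in plain Python (this is not the VertiPaq engine; the numbers and the `dictionary_sizes` helper are illustrative assumptions only). Column stores keep one dictionary entry per distinct value, so a unique transaction number needs one entry per row, while splitting the key into two lower-cardinality integer columns shrinks the dictionaries dramatically:

```python
# Rough, scaled-down illustration (plain Python, not the VertiPaq engine)
# of why a unique key column dominates model size: its dictionary must
# hold one entry per row. Splitting the key into two lower-cardinality
# integer columns shrinks the dictionaries dramatically.

def dictionary_sizes(transaction_ids):
    """Distinct-value counts for the original key vs. a split encoding."""
    full = len(set(transaction_ids))                     # one entry per row
    high = len({t // 10_000 for t in transaction_ids})   # upper digits
    low = len({t % 10_000 for t in transaction_ids})     # lower digits
    return full, high, low

# 800k sequential transaction numbers standing in for the real 8M rows.
full, high, low = dictionary_sizes(range(800_000))
print(full, high, low)  # 800000 80 10000
```

The same split can be done in Power Query before load; if the full key is ever needed, it can be reconstructed on demand rather than stored.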

Regards

Thanks @sergej_og !


We are trying to follow best practices to reduce the size. Yes, it is a key column - the transaction number.

Hi @Suriya_1 ,

Thank you for the update. The 1 GB compressed dataset limit in Power BI Pro is a hard limit and cannot be exceeded. Creating or distributing models across multiple workspaces does not increase the maximum size available to a single dataset. Excessive memory consumption from key or high-cardinality columns is primarily a data model design challenge, not something licensing alone can resolve.
For large, transactional, and memory-intensive models, the supported and scalable solution is Premium Capacity combined with proper model optimisation.

Best Regards,
Chaithra E.

Zanqueta
Solution Sage

Hi @Suriya_1,

 

1. Dataset Size Limit in Power BI Pro

  • The 1 GB limit per dataset in Power BI Pro is fixed and applies regardless of the number of workspaces. Creating multiple workspaces does not increase the capacity for a single dataset.
  • The 10 GB shared capacity refers to the total storage across all datasets in your tenant, but each dataset cannot exceed 1 GB compressed in Import Mode.
Implication:
If your semantic model approaches or exceeds 1 GB compressed, you will not be able to load it successfully in Power BI Pro.

2. DirectQuery Timeout Issues

  • DirectQuery relies on the performance of the underlying data source. Large queries (such as those scanning millions of rows) can easily hit timeouts.
  • Power BI Service has default query timeouts (typically 225 seconds for visuals). Long-running queries will fail unless optimised at the source.

Recommended Approaches

Option A: Optimise Import Mode

  • Reduce model size:
    • Remove unnecessary columns and tables.
    • Use aggregation tables for summarised data.
    • Apply data type optimisation (e.g., use integers instead of strings where possible).
  • Incremental Refresh:
    • This is highly recommended for large datasets.
    • It allows you to refresh only new or changed data rather than the entire dataset.
    • Note: The initial full refresh must succeed once, so you may need to optimise the source or split the load temporarily.
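The partitioning idea behind incremental refresh can be sketched as follows (a conceptual Python model with assumed names, not the Power BI engine or its RangeStart/RangeEnd mechanics): historical partitions are left untouched, and only the partitions inside the refresh window are reloaded on each refresh.

```python
from datetime import date

# Conceptual sketch of incremental refresh (assumed names, not the
# Power BI engine): historical partitions stay as-is; only partitions
# inside the refresh window are reloaded on each scheduled refresh.

def partitions_to_refresh(partition_months, today, refresh_window_months):
    """Return the monthly partitions an incremental refresh would reload."""
    cutoff_index = today.year * 12 + today.month - refresh_window_months
    return [p for p in partition_months
            if p.year * 12 + p.month > cutoff_index]

months = [date(2025, m, 1) for m in range(1, 13)]
refresh = partitions_to_refresh(months, date(2025, 12, 15), 3)
print([m.month for m in refresh])  # only the last 3 months reload: [10, 11, 12]
```

This is why the initial refresh is the hard part: it must populate every partition once, after which each scheduled refresh touches only the small recent window.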

Option B: Consider Premium or PPU

  • Power BI Premium Per User (PPU) or Premium Capacity increases dataset size limits:
    • PPU: Up to 100 GB per dataset.
    • Premium Capacity: Up to 400 GB per dataset.
  • Premium also supports larger memory allocations, longer query timeouts, and advanced features like aggregations and hybrid tables.

Option C: Improve DirectQuery Performance

  • Push filtering and aggregations to the source (SQL views or stored procedures).
  • Use composite models (Import for aggregated data + DirectQuery for detail).
  • Ensure the source database is indexed and optimised for analytical queries.
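The composite-model idea in the second bullet can be sketched like this (a toy Python model under assumed names and data; the real aggregation matching is done by the Power BI engine, not user code): queries at the aggregated grain are answered from a small imported table, and only queries that genuinely need row-level detail fall through to the slow DirectQuery source.

```python
# Toy sketch of the aggregation-hit idea behind composite models
# (assumed names and data; the real matching is done by the engine):
# month-grain queries hit a small imported aggregate, and only
# detail-level queries fall through to the DirectQuery source.

detail_rows = [  # stand-in for the 8M-row transactional source
    {"month": "2025-01", "txn": 1, "amount": 100.0},
    {"month": "2025-01", "txn": 2, "amount": 50.0},
    {"month": "2025-02", "txn": 3, "amount": 75.0},
]

# Pre-aggregated (imported) table at month grain.
agg = {}
for r in detail_rows:
    agg[r["month"]] = agg.get(r["month"], 0.0) + r["amount"]

def total_amount(month, need_detail=False):
    """Answer from the aggregation when possible; scan detail otherwise."""
    if not need_detail and month in agg:
        return agg[month], "aggregation hit"
    total = sum(r["amount"] for r in detail_rows if r["month"] == month)
    return total, "detail scan (DirectQuery)"

print(total_amount("2025-01"))  # served from the imported aggregate
```

The practical payoff: most visuals operate at an aggregated grain, so they never touch the DirectQuery source at all, which is what eliminates the timeouts.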

Best Practices

  • Avoid loading raw transactional data into Power BI.
  • Implement star schema modelling for efficiency.
  • Use aggregations and hybrid tables for large datasets.
  • Monitor refresh performance and consider partitioning strategies.

 

Official References:

What is Power BI Premium? - Microsoft Fabric | Microsoft Learn

Configure incremental refresh and real-time data for Power BI semantic models - Power BI | Microsoft...

DirectQuery in Power BI: When to Use, Limitations, Alternatives - Power BI | Microsoft Learn

 

If this response was helpful in any way, I’d gladly accept a 👍, much like the joy of seeing a DAX measure work first time without needing another FILTER.

Please mark it as the correct solution. It helps other community members find their way faster (and saves them from another endless loop 🌀).

Hi @Zanqueta 

Thank you for your quick response.

Quick question regarding Option B of the recommended approach (Premium or PPU).

You mentioned that upgrading to Power BI Premium Per User (PPU) or Premium Capacity increases dataset size limits:

  • PPU: Up to 100 GB per dataset

  • Premium Capacity: Up to 400 GB per dataset

These options also provide larger memory allocations, longer query timeouts, and advanced features such as aggregations and hybrid tables.

Our question is about the practical implications when choosing between PPU (100 GB limit) and Premium Capacity (400 GB limit). Since our semantic model pulls data from transactional tables, we noticed that even applying a single filter on one field consumes around 440 MB of memory. This raises the concern that even with a 100 GB dataset limit, the memory usage could still grow significantly depending on cardinality and filter operations.

Given this, would PPU (100 GB per dataset) be sufficient, or would Premium Capacity (400 GB per dataset) be more appropriate for our scenario?

Additionally, could you please assist with a cost comparison between PPU and Premium Capacity to help us evaluate Option B?

Hi @Suriya_1,
Refer to this official Microsoft documentation for all the details you need. The 400 GB limit for Premium generally applies only to the highest-capacity SKUs.

 

 

Give a Thumbs Up if this post helped you in any way and Mark This Post as Solution if it solved your query !!!

Proud To Be a Super User !!!
LinkedIn

Hi @Anand24  @Zanqueta 

To my understanding, PPU costs $24 per user, provides up to 100 GB of dataset memory, and offers storage that starts around 10 TB and can grow depending on the tenant, with 100 TB being the upper limit typically supported.

Premium Per Capacity (PPC) is priced based on the capacity SKU you purchase, and it supports up to 400 GB of dataset memory (on the highest SKUs) along with 100 TB or more of storage.

Correct me if I'm wrong.

@Suriya_1,
Seems right. You can use this Microsoft licensing guide for better clarity.

 

Couple of pointers:
1. All licenses (Pro, PPU and Premium Capacity) can be cheaper if your client/organization has already purchased licenses in bulk. Microsoft offers discounts on every license type for bulk purchases. For example, a Pro license, which costs $14, sometimes goes as low as ~$7.60. So ask your Power BI platform admin team about this as well.

 

2. The type of license you choose also depends heavily on how many users will consume the dashboard. Even if you have a PPU license, free-licensed users generally cannot view reports in a non-Premium workspace. If the user base is large, I'd recommend Premium Capacity, since you only need to pay for the capacity plus some Pro licenses for developers, while free-licensed users can still view the reports.

 

IMO, PPU is best when you have a cost constraint and only certain people need premium capabilities. Premium Capacity is the go-to for getting all premium features when the user base is large.

 

 

Give a Thumbs Up if this post helped you in any way and Mark This Post as Solution if it solved your query !!!

Proud To Be a Super User !!!
LinkedIn
