ganaa0810
Regular Visitor

Large datasets

Hello everyone, I'm new to Power BI, on a Pro license, and dealing with a dataset that's large for me.
I got an error after publishing to the Service because of the huge initial data load. I configured incremental refresh on the 8 biggest fact tables. Then some model changes were needed, so I made them and republished, and now I'm getting a connection timeout error.
It was working until we got our historic data: previously I only had one year of data, and now we have two years.
How can I solve this? Please help.

1 ACCEPTED SOLUTION
v-sgandrathi
Community Support

Hi @ganaa0810,

 

The main challenge isn't the size of your PBIX file on desktop, but rather the full refresh that occurs after every model change in the Power BI Service. Even though your PBIX is only 230 MB locally, publishing a model with incremental refresh enabled causes the Service to rebuild the entire dataset once, including all historical data. This process can exceed Pro capacity limits or result in connection timeouts, since the Service tries to load two years of data at once.

This is expected behavior for Pro: after any model change, incremental refresh can't reuse previous partitions and must reprocess everything. Premium/PPU offers more capacity, faster refreshes, and advanced features like loading data without reprocessing all history.

Your approach is correct, but your model has reached a size where Pro struggles during the initial full refresh. On Power BI Pro, you can temporarily reduce the refresh window (such as loading only 6 months), publish, and then expand the window. For a more stable solution, consider moving to PPU or Premium for better handling of large historical data loads.
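For context, the window the Service loads is driven entirely by the RangeStart/RangeEnd filter in Power Query. A minimal M sketch of that pattern, assuming a SQL Server source; the server, database, table, and column names below are placeholders, not your actual model:

```
// A minimal sketch of the filter behind an incremental refresh policy,
// assuming a SQL Server source. "WarehouseServer", "Retail", dbo.FactSales,
// and OrderDate are placeholder names, and RangeStart/RangeEnd must exist
// as datetime parameters in Power Query.
let
    Source = Sql.Database("WarehouseServer", "Retail"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Use >= and < so rows never land in two partitions; this simple
    // comparison folds to the source, so only the window is actually loaded.
    Windowed = Table.SelectRows(
        FactSales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Windowed
```

With this in place, you can set the incremental refresh policy to keep only 6 months for the first publish, let the initial refresh complete, and then widen the policy afterwards, as described above.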

 

Thank you.


8 REPLIES
TomMartens
Super User

Hey @ganaa0810 ,

 

Without knowing the size of the semantic model in Power BI Desktop (meaning locally), it's not possible to say whether you are hitting the Pro size limit of 1 GB.

Incremental refresh does not help you overcome this limit; it only loads data faster by adding chunks of data (the increment) to an existing model, but the total size cannot exceed the size limit set by the licensing.

For Power BI Pro this limit is 1 GB for a semantic model in the Power BI Service.

Regards,
Tom



Did I answer your question? Mark my post as a solution, this will help others!

Proud to be a Super User!
I accept Kudos 😉
Hamburg, Germany

The .pbix is just 230 MB on my local machine, though. The problem occurs after publishing to the Service.
After publishing, it replaces everything and reloads the data plus the historic data, meaning it's loading everything at once. I think that's the culprit.

 

 

Poojara_D12
Super User

Hi @ganaa0810 

Your model worked when the data was small, but the moment two years of history came in, the whole setup got too heavy for a Pro workspace. When you changed the model and republished, Power BI had to run a full refresh again, not an incremental one, and that full load is now too big, so it times out. The only way out is to trim the model, make the queries more efficient, make sure the incremental refresh queries are actually folding, and push heavy work back to the source (a sketch of that follows below). If the data keeps growing, a Pro license will struggle no matter what, and you may eventually need PPU or a proper data warehouse feeding Power BI.
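Since folding and source-side aggregation are the key levers here, a hedged M sketch of what that can look like. It assumes a SQL Server source and a connector that supports folding over native queries; all names (WarehouseServer, Retail, dbo.FactSales, OrderDate, StoreId, SalesAmount) are hypothetical:

```
// A sketch under these assumptions: SQL Server source, a connector that
// supports folding over native queries, and hypothetical names
// (WarehouseServer, Retail, dbo.FactSales, OrderDate, StoreId, SalesAmount).
let
    Source = Sql.Database("WarehouseServer", "Retail"),
    // The warehouse does the heavy GROUP BY; EnableFolding = true lets
    // later steps fold on top of this native query.
    DailyAgg = Value.NativeQuery(
        Source,
        "SELECT OrderDate, StoreId, SUM(SalesAmount) AS SalesAmount
         FROM dbo.FactSales
         GROUP BY OrderDate, StoreId",
        null,
        [EnableFolding = true]
    ),
    // The incremental refresh window still folds into a WHERE clause.
    Windowed = Table.SelectRows(
        DailyAgg,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Windowed
```

You can check folding by right-clicking the last applied step in the Power Query editor: if View Native Query is greyed out, folding has broken and the Service will pull the whole table before filtering.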

 

Did I answer your question? Mark my post as a solution, this will help others!
If my response(s) assisted you in any way, don't forget to drop me a "Kudos"

Kind Regards,
Poojara - Proud to be a Super User
Data Analyst | MSBI Developer | Power BI Consultant
Consider Subscribing my YouTube for Beginners/Advance Concepts: https://youtube.com/@biconcepts?si=04iw9SYI2HN80HKS

Thanks for your answer.
First of all, incremental refresh itself is working fine; it only got worse when the historic data came in.
We do have a warehouse, and I'm actually getting pre-aggregated data, so it can't be reduced any further. If it were reduced below this, the stakeholders couldn't see what they want. (They want year comparisons, month comparisons, and even day-of-month comparisons, which is why the data is large; by the way, it's a retail company.)
What I'm wondering is: when I change the model or add a table, how can I avoid reloading all the data and deploy only the model architecture change? Is that only available in Premium, or can it be done in Pro?
I'm confused whether I'm doing it wrong or it's because of the license.

TomMartens
Super User

Hey @ganaa0810 ,

 

How large is the semantic model when you open Power BI Desktop?

Because you mentioned that you are on Power BI Pro, the model cannot exceed 1 GB, whether using incremental refresh or a full refresh.
Open the PBIX in Power BI Desktop and check the size with DAX Studio, using the Vertipaq Analyzer inside DAX Studio.
The sum of the column sizes across all tables of the semantic model must not be larger than 1 GB (keep in mind that the size is measured in bytes).

Here you can download DAX Studio: https://daxstudio.org/downloads/
Use the portable version in case you are not allowed to install anything on your machine.

 

You will also find documentation on the site that will get you started.

Regards,
Tom



Did I answer your question? Mark my post as a solution, this will help others!

Proud to be a Super User!
I accept Kudos 😉
Hamburg, Germany
grazitti_sapna
Super User

Hi @ganaa0810 

 

You can try the following and let us know if the issue still persists:

  • Double-check the incremental refresh policy (RangeStart/RangeEnd on each big table).
  • Reduce columns and pre-aggregate the facts.
  • Temporarily limit the refresh to the latest 6 months for the first publish.
  • Increase the source DB timeout settings (see the sketch after this list).
  • Monitor the refresh history after publishing to catch slow tables.

A full model refresh after a schema change with large historic data can trigger timeouts; breaking the initial load into smaller windows and publishing from Desktop first is key.
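On the timeout point specifically, here is a hedged sketch of raising the command timeout on a SQL Server source; the server and database names are placeholders:

```
// A minimal sketch, assuming a SQL Server source; "WarehouseServer" and
// "Retail" are placeholder names. CommandTimeout takes a duration as
// #duration(days, hours, minutes, seconds); the default is ten minutes.
let
    Source = Sql.Database(
        "WarehouseServer",
        "Retail",
        [CommandTimeout = #duration(0, 2, 0, 0)]  // allow up to 2 hours
    ),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data]
in
    FactSales
```

This only helps when the source query itself is slow; if the dataset exceeds the Pro size limit, a longer timeout won't change the outcome.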

 

🌟 I hope this solution helps you unlock your Power BI potential! If you found it helpful, click 'Mark as Solution' to guide others toward the answers they need.

💡 Love the effort? Drop the kudos! Your appreciation fuels community spirit and innovation.

🎖 As a proud SuperUser and Microsoft Partner, we’re here to empower your data journey and the Power BI Community at large.


Let’s keep building smarter solutions together!

 
