YacinO
Regular Visitor

Dataset isn't uploading: bug or wait?

Hello everyone,

Thanks for reading!

I'm uploading a 9.6 GB dataset to an external tenant. A few days ago I uploaded a 12 GB dataset and everything went well. But today, when I try to upload this dataset, the loading starts and then the message disappears from the screen as if nothing happened: no error message, no confirmation that the upload succeeded, and no "The data is importing" message either. The workspace is a Premium one. Is there a hidden limitation I'm not aware of, or something like that?

 

Thanks for your help!

 

1 ACCEPTED SOLUTION
lbendlin
Super User

There is a 10 GB limit per partition. Most likely your 12 GB semantic model was compressed to below the limit, or its partitions were smaller.

 

The standard approach for the initial load of overly large semantic models is partition bootstrapping.

"For very large models in Premium capacities that likely contain billions of rows, the initial refresh operation can be bootstrapped. Bootstrapping allows the service to create table and partition objects for the model, but doesn't load and process data into any of the partitions. By using SQL Server Management Studio, you can set partitions to be processed individually, sequentially, or in parallel, to both reduce the amount of data returned in a single query, and also bypass the five-hour time limit. To learn more, see Advanced incremental refresh - Prevent timeouts on initial full refresh."
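The quoted guidance mentions processing partitions individually through SQL Server Management Studio. As an illustrative sketch (the database, table, and partition names below are placeholders, not from this thread), the underlying TMSL refresh command for a single partition can be generated like this and pasted into an SSMS XMLA query window:

```python
import json

def tmsl_refresh_partition(database, table, partition, refresh_type="dataOnly"):
    """Build a TMSL 'refresh' command targeting a single partition.

    refresh_type "dataOnly" loads data without rebuilding dependent
    calculated objects; "full" does both. Processing one partition per
    command keeps each refresh query small, which is the point of the
    bootstrapping approach described above.
    """
    command = {
        "refresh": {
            "type": refresh_type,
            "objects": [
                {"database": database, "table": table, "partition": partition}
            ],
        }
    }
    return json.dumps(command, indent=2)

# Hypothetical example: process yearly partitions of a large fact table
# sequentially, one command per partition.
for name in ["Sales-2021", "Sales-2022", "Sales-2023"]:
    print(tmsl_refresh_partition("MySemanticModel", "Sales", name))
```

Each printed command can be executed one at a time (or scripted in parallel) against the workspace's XMLA endpoint.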

 

View solution in original post

6 REPLIES
v-sathmakuri
Community Support

Hi @YacinO ,

 

I hope the information provided is helpful. Feel free to reach out if you have any further questions or would like to discuss this in more detail. If a response answered your question, please accept it as a solution so other community members with similar problems can find it faster.

 

Thank you!!

v-sathmakuri
Community Support

Hi @YacinO ,

 

I wanted to check whether you have had a chance to review the information provided. Please feel free to contact us if you have any further questions. If a response has addressed your query, please accept it as a solution and give it a 'Kudos' so other members can find it easily.

 

Thank you!!

v-sathmakuri
Community Support

Hi @YacinO ,

 

Thank you for reaching out to Microsoft Fabric Community.

 

Thank you @lbendlin  for the prompt response.

 

May I ask whether you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find it faster.

 

Thank you.


Hello @lbendlin, thanks for your answer!

 

I don't know if there is a correlation, but I tried sending the file to my client so he could open it and publish it directly from his Power BI Desktop. He gets an error message saying: "File name" is damaged and cannot be opened by Power BI Desktop. Is this issue related to the fact that I can't publish it?

 

How do you send that file to your client? It's likely too big for email.
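A "damaged" file after a manual transfer often means it was truncated or corrupted in transit. As a standalone sketch (not something posted in this thread), one way to rule that out is to compare a checksum on both ends; and since a .pbix file is a ZIP container, a basic structural sanity check is also possible:

```python
import hashlib
import zipfile

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so multi-GB files fit in memory.

    The sender runs this and shares the hex digest; the recipient runs it
    on the received copy. A mismatch means the file changed in transit.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def looks_intact(path):
    """A .pbix is a ZIP container; a truncated transfer usually breaks
    the ZIP end-of-central-directory record, which this check detects."""
    return zipfile.is_zipfile(path)
```

If the checksums match and the ZIP check passes but Power BI Desktop still reports the file as damaged, the problem is more likely in the file itself than in the transfer.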
