Hello everyone,
Thanks for reading!
I'm uploading a 9.6 GB dataset to an external tenant. A few days ago I uploaded a 12 GB dataset and everything went well. But today when I try to upload this dataset, the loading starts and then the message disappears from the screen, and it's as if nothing happened: no error message, and no validation message saying that everything went well (also no "The data is importing" message). The workspace is a Premium one, and as I said, I already uploaded a 12 GB dataset successfully. Is there a hidden limitation that I'm not aware of, or something like that?
Thanks for your help!
Solved! Go to Solution.
There is a 10 GB limit per partition. Most likely your 12 GB semantic model was compressed to fall under the limit, or its partitions were smaller.
The standard approach for the initial load of overly large semantic models is partition bootstrapping.
"For very large models in Premium capacities that likely contain billions of rows, the initial refresh operation can be bootstrapped. Bootstrapping allows the service to create table and partition objects for the model, but doesn't load and process data into any of the partitions. By using SQL Server Management Studio, you can set partitions to be processed individually, sequentially, or in parallel, to both reduce the amount of data returned in a single query, and also bypass the five-hour time limit. To learn more, see Advanced incremental refresh - Prevent timeouts on initial full refresh."
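Processing partitions "individually, sequentially, or in parallel" as described above is done by sending TMSL refresh commands through the XMLA endpoint (for example from SQL Server Management Studio). As a minimal sketch, the snippet below builds such a command for one partition at a time; the TMSL `refresh` command shape is documented by Microsoft, but the database, table, and partition names here are hypothetical placeholders.

```python
import json

def build_partition_refresh(database, table, partitions, refresh_type="dataOnly"):
    """Build a TMSL 'refresh' command that processes only the named
    partitions, so a single request returns less data than a full refresh."""
    return {
        "refresh": {
            "type": refresh_type,
            "objects": [
                {"database": database, "table": table, "partition": p}
                for p in partitions
            ],
        }
    }

# Process two monthly partitions one at a time (all names are hypothetical).
for partition in ["Sales-2024-01", "Sales-2024-02"]:
    cmd = build_partition_refresh("MySemanticModel", "Sales", [partition])
    print(json.dumps(cmd, indent=2))  # paste/run as an XMLA script in SSMS
```

Each generated command can be executed as an XMLA script against the workspace's XMLA endpoint; running them sequentially keeps each query small, while submitting several at once processes partitions in parallel.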
Hi @YacinO ,
I hope the information provided is helpful. Feel free to reach out if you have any further questions or would like to discuss this in more detail. If the responses answered your question, please accept one as a solution so other community members with similar problems can find it faster.
Thank you!!
Hi @YacinO ,
I wanted to check whether you have had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If the responses have addressed your query, please accept one as a solution and give a 'Kudos' so other members can easily find it.
Thank you!!
Hi @YacinO ,
Thank you for reaching out to Microsoft Fabric Community.
Thank you @lbendlin for the prompt response.
May I ask whether you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.
Thank you.
Hello @lbendlin , thanks for your answer!
I don't know if there is a correlation, but I tried to send the file to my client so he could open it and publish it directly from his Power BI Desktop. He got an error message saying: "File name" is damaged and cannot be opened by Power BI Desktop. Is this issue related to the fact that I can't publish it?
How did you send that file to your client? It's likely too big for email.