Hi everyone!
I have a Power BI dashboard that gets most of its data from a SQL Server database. A lot of data is imported, so Power BI usually takes about 5 hours to refresh all the data in the app.
However, when I successfully publish it to the Power BI workspace and try to open it, I get the following error:
"Database '--' exceeds the maximum size limit on disk; the size of the database to be loaded or committed is 131819852 bytes, and the valid size limit is 53477376 bytes."
The Power BI file itself does not exceed 50 MB (it is 40.3 MB). In addition, I am using the "Import" option in the query rather than "DirectQuery". I am using a Premium Power BI workspace, but its limit is only 50 MB.
The question is: how do I get around the size limit? Is there any way to improve the data refresh, perhaps by switching to "DirectQuery"?
Hello @LeoMandro ,
Try to optimise your report.
Check this article: https://www.linkedin.com/pulse/power-bi-optimization-framework-ahmad-chamy-nensf/
Proud to be a Super User!
Hi @LeoMandro ,
First of all, many thanks to @Idrissshatila for your very quick and effective replies.
You could try the steps below:
1. Consider Using DirectQuery: Since you're currently using the "Import" option, switching to "DirectQuery" could be a viable solution. DirectQuery doesn't import data into Power BI; instead, it queries the underlying data source in real-time. This means the size of your dataset on disk won't be a limiting factor.
2. Optimize Your Data Model: If sticking with the "Import" option, consider optimizing your data model to reduce its size. This could involve removing unnecessary columns, aggregating data at a higher level, or using incremental refresh to only load data that has changed.
3. Incremental Refresh: For Premium workspaces, you have the option to use Incremental Refresh, which allows you to refresh only the data that has changed, rather than the entire dataset. This can significantly reduce the amount of data processed and stored. To learn more about setting up Incremental Refresh, visit Incremental refresh in Power BI.
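As a sketch of option 3: incremental refresh requires two reserved datetime parameters, RangeStart and RangeEnd (these exact names are what Power BI expects), plus a filter on a date column in the source query. The server, database, table (dbo.Sales), and column (OrderDate) below are hypothetical placeholders; substitute your own.

```powerquery-m
let
    // Assumed server and database names - replace with your own
    Source = Sql.Database("MyServer", "MyDatabase"),
    Sales  = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // RangeStart and RangeEnd must be defined as DateTime parameters
    // in Power Query; Power BI substitutes their values to build
    // the partitions for incremental refresh.
    Filtered = Table.SelectRows(
        Sales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

After applying a filter like this, you define the actual refresh policy (e.g. "archive 5 years, refresh the last 10 days") in Power BI Desktop via the table's "Incremental refresh" settings; the filter also folds back to SQL Server, so only the changed rows are queried on each refresh.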
Best regards,
Community Support Team_Binbin Yu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hello!
Thanks for the help. I made some changes to the SQL code and made it a little faster. Unfortunately, I can't remove any data from it, as all of it is used in the dashboard.
Full error message:
Underlying Error: PowerBI service client received error HTTP response. HttpStatus: 503. PowerBIErrorCode: OpenConnectionError
OpenConnectionError: Database '9b6abbcf-76d8-45ae-87bd-8e3084dba66f' exceeds the maximum size limit on disk; the size of the database to be loaded or committed is 131819852 bytes, and the valid size limit is 53477376 bytes. If using Power BI Premium, the maximum DB size is defined by the customer SKU size (hard limit) and the max dataset size from the Capacity Settings page in the Power BI Portal.
Correlation ID: d0e4cce0-6e22-58cd-d1c0-6f25947dd976
Activity ID: 03110582-8a6c-42a4-9432-4e36a98319f1
Request ID: c61120f2-938f-4dc5-a9b8-3c775c388561
Time: Wed May 08 2024 09:14:33 GMT+0200 (Central European Summer Time)
Service version: 13.0.23216.54
Client version: 2404.5.19055-train
Cluster URI: https://0ae51e1907c84e4bbb6d648ee58410f4-api.analysis.windows.net/