I set up auto refresh on a report I recently worked on, but the next day received this error:
"The amount of data on the gateway client has exceeded the limit for a single table. Please consider reducing the use of highly repetitive strings values through normalized keys, removing unused columns, or upgrading to Power BI Premium."
Hi @BITSMH ,
You should check the following.
First, check whether Premium is set up correctly. There are various Premium SKUs, and they have different limits on dataset size.
Second, make sure the workload settings of the Premium capacity are configured correctly in the admin portal.
Third, enable the large dataset storage format in Power BI Premium. Large datasets in the service do not affect the Power BI Desktop model upload size, which is still limited to 10 GB. Instead, datasets can grow beyond that limit in the service on refresh.
Fourth, as the error message says, reduce the use of highly repetitive, long string values and use normalized keys instead. Add a primary key and move most of the calculations to DAX to limit dataset size, or refer to the method described in this article (How to Minimize Data Load Size for Tables in Power BI).
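The normalized-key idea can be sketched outside Power BI as well. A minimal Python illustration (the table and column names here are made up for the example, not from the original report): instead of storing the same long string on every row of a fact table, store each distinct string once in a small lookup ("dimension") table and keep only a small integer key on the fact rows.

```python
def normalize_column(rows, column):
    """Replace repeated string values in `column` with integer keys.

    Returns (normalized_rows, lookup), where `lookup` maps key -> original
    string, like a dimension table you would relate back via the key.
    """
    key_for = {}   # string value -> integer key
    lookup = {}    # integer key  -> string value (the "dimension" table)
    normalized = []
    for row in rows:
        value = row[column]
        if value not in key_for:
            key = len(key_for)          # assign keys 0, 1, 2, ...
            key_for[value] = key
            lookup[key] = value
        new_row = dict(row)
        new_row[column] = key_for[value]  # fact row now carries a small int
        normalized.append(new_row)
    return normalized, lookup

# Example: a "fact" table where the same long category string repeats.
facts = [
    {"amount": 10, "category": "Consumer Electronics - Accessories"},
    {"amount": 20, "category": "Consumer Electronics - Accessories"},
    {"amount": 15, "category": "Home & Garden - Outdoor Furniture"},
]
norm, dim = normalize_column(facts, "category")
# Each long string is now stored once in `dim`; the fact rows hold ints.
```

In Power BI you would do the equivalent in Power Query or at the source: split the repetitive text column into its own dimension table with a key, relate the two tables, and let the fact table carry only the key.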
Best Regards
Community Support Team _ chenwu zhu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Try running your data model through DAX Studio or Tabular Editor, where you can run a best-practice check on your dataset and run performance tests to get a better understanding of where it is losing performance and failing.