
bvbull200
Helper III

Understanding the Error: The amount of data on the gateway client has exceeded the limit...

I have tried reading through various threads about how to "fix" the error "The amount of data on the gateway client has exceeded the limit for a single table" when refreshing a dataset through the Power BI service, but I still can't quite wrap my head around what is happening.

 

I have a 3.1 GB csv file. It is output from Qlikview, where all of the transformation takes place. I load that csv into Power BI and do no additional transformation other than the "Change Type" step to format the columns properly. There are no other steps beyond this; the query is just the csv file.

 

I can then publish the dataset to a workspace, no problem.

 

I go to refresh and I get the error above. In reading through, it says that the limit for a single table is 10 GB. So what gives? How did I go from a 3.1 GB table, to a 0.25 GB pbix file, to a table that is over 10 GB?

 

I'm paring down columns and data right now as best I can, but I'm struggling to understand exactly how I got here in the first place.

 

Any insight is greatly appreciated. 

1 ACCEPTED SOLUTION
v-cazheng-msft
Community Support

Hi @bvbull200,

 

Power BI stores imported data in a highly compressed format, and the compression ratio can be as high as 10:1. That's why the pbix file is small even though the original data volume is large.
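To make the compression point concrete, here is a small Python sketch. It uses the stdlib `zlib` purely as a stand-in (Power BI's VertiPaq engine uses different, columnar encodings such as dictionary and run-length encoding), but it shows the same effect: a column with few distinct values compresses dramatically, while a column of unique values barely compresses at all.

```python
import zlib

# Low-cardinality column: a few repeated values, like a region code.
low_card = ("EAST\n" * 250_000 + "WEST\n" * 250_000).encode()

# High-cardinality column: half a million unique IDs.
high_card = "\n".join(str(i) for i in range(500_000)).encode()

for name, raw in [("low cardinality", low_card), ("high cardinality", high_card)]:
    packed = zlib.compress(raw, 6)
    print(f"{name}: {len(raw):,} bytes -> {len(packed):,} bytes "
          f"({len(raw) / len(packed):.0f}x smaller)")
```

This is why a 3.1 GB source file can become a 0.25 GB pbix, and also why removing high-cardinality columns shrinks the model the most.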

 

The 10 GB limit mentioned in the error is not the size of the pbix/model, since model data is compressed. It is the raw, uncompressed data (fetched by the queries generated in the Power BI service) that must be transferred through the gateway during a refresh in order to process the model in the service. This is a by-design limitation, and the product group is considering making the error message clearer in the future to help with troubleshooting.

 

Currently, please consider these solutions:

1 Remove high-cardinality columns with the DAX Studio tool

Connect to the model/pbix file > go to the Advanced tab > View Metrics > remove the columns with the highest cardinality

[screenshot: DAX Studio View Metrics]

 

2 Remove unnecessary data or columns from the model

3 Upgrade your workspace to a Premium capacity

For more details, you may refer to Pricing & Product Comparison | Microsoft Power BI.
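If you would rather check cardinality at the source before the data ever reaches Power BI, a stdlib-only Python sketch can count distinct values per column in the CSV. The `export.csv` path here is a hypothetical placeholder for the Qlikview output file:

```python
import csv
import os
from collections import defaultdict

# Hypothetical path to the Qlikview CSV export; adjust to your file.
CSV_PATH = "export.csv"

def column_cardinality(path):
    """Return {column name: number of distinct values} for a CSV file."""
    distinct = defaultdict(set)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for col, val in row.items():
                distinct[col].add(val)
    return {col: len(vals) for col, vals in distinct.items()}

if __name__ == "__main__" and os.path.exists(CSV_PATH):
    # Highest-cardinality columns first: prime candidates for removal.
    for col, n in sorted(column_cardinality(CSV_PATH).items(),
                         key=lambda kv: kv[1], reverse=True):
        print(f"{col}: {n:,} distinct values")
```

Note this streams the file row by row, so it works even on a 3.1 GB CSV, though the per-column sets can themselves use significant memory for very high-cardinality columns.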

 

If any post helps, please consider accepting it as the solution to help other members find it more quickly. If I misunderstood your needs or you still have problems, please feel free to let me know. Thanks a lot!

 

Best Regards,


2 REPLIES

GilbertQ
Super User

Hi @bvbull200 

 

This could happen because when the data is sent via the gateway it is in an uncompressed state.

The other potential reason is that, because it has to load the entire dataset into memory, it has to send the CSV file completely before it can then apply the "Change Type" step.

 

What happens if you remove the step?





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!






