Another day, and another strange message from PQ online. I was trying to refresh a PBI online dataflow, and received the following:
PipelineException: With compression algorithm the compressed data size in a packet exceeds the max ServiceBus limit: GatewayCompressor - CompressedDataSize (25906718) of a non-compressed packet exceeds the maximum payload size of 8500000 .
I'm not sure how to resolve this exactly. I was pulling a very large chunk of JSON and trying to encode it in a dataflow table for subsequent use in another table (a "computed entity"). However, it appears that PQ is not willing or able to help me out.
Does anyone have any idea how to send a very large chunk of JSON out into the storage account where my dataflows live?
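One possible workaround (a rough sketch only; the source URL and chunk size below are placeholders) is to split the big JSON text into smaller pieces so that each row stays under the payload limit, and let the computed entity stitch the pieces back together:

let
    // Placeholder source: the large JSON document as a single text value
    Source = Text.FromBinary(Web.Contents("https://example.com/big.json")),
    // Keep each chunk well under the 8.5 MB packet limit (the size here is a guess)
    ChunkSize = 1000000,
    ChunkCount = Number.RoundUp(Text.Length(Source) / ChunkSize),
    // One text piece per chunk; Text.Middle safely truncates the final piece
    Chunks = List.Transform({0 .. ChunkCount - 1}, each Text.Middle(Source, _ * ChunkSize, ChunkSize)),
    // The Index column lets the downstream computed entity reassemble the pieces in order
    AsTable = Table.FromColumns({{0 .. ChunkCount - 1}, Chunks}, {"Index", "JsonChunk"})
in
    AsTable

A downstream computed entity could then sort by Index, Text.Combine the JsonChunk values, and parse the result with Json.Document.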
I wish this error message were more meaningful. Most people shouldn't have to know or care about "ServiceBus", nor should they have to worry about bumping into some random/arbitrary 8.5 MB payload maximum.
Any help would be appreciated. This error doesn't come up in any Google searches related to Power BI.
While trying to fix this, I get another error in a similar format.
This time it's complaining about uncompressed sizes.
PipelineException: The uncompressed data size specified in a packet header exceeds max limit: GatewayDecompressor - Header.UncompressedDataSize (193364724) of a compressed packet exceeds the maximum allowed uncompressed payload of 157286421 .
Gateways and APIs have payload size limits; AWS, for instance, enforces similar restrictions.
I think moving to Premium capacity could help you out.
Regards
Amine Jerbi
If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook
We are already on Premium.
I really wish dataflows were more user-friendly. There should be a way to serialize arbitrary objects out there, for the sake of related entities ("computed entities").
These CSV-compatibility requirements are just plain obnoxious. There should be a way to save XML, JSON, plain text, and lots more. Even binary formats like Parquet would be very helpful!
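For what it's worth, one way to smuggle arbitrary binary content through the CSV-based storage today might be to Base64-encode it into a text column. This is a minimal sketch only, with a placeholder URL:

let
    // Placeholder source: any binary payload (Parquet, zipped JSON, etc.)
    RawBinary = Web.Contents("https://example.com/data.parquet"),
    // Base64 keeps the bytes intact inside a plain text (CSV-friendly) column
    Encoded = Binary.ToText(RawBinary, BinaryEncoding.Base64),
    // A downstream computed entity can decode it with Binary.FromText(..., BinaryEncoding.Base64)
    AsTable = Table.FromRecords({[Payload = Encoded]})
in
    AsTable

Note that a single Base64 row can still blow past the packet limit, so very large payloads would need the same chunking trick as above.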
You can submit your idea/issue here:
https://community.powerbi.com/t5/Issues/idb-p/Issues
Who knows, they may help you out.
Thanks for the tip @aj1973
Who knows? I know. I'm still waiting for some "public previews" that have been works in progress for a couple of years. I will wait for the more important items to be fixed before adding to the backlog.