dbeavon3
Continued Contributor

PipelineException when running dataflow. CompressedDataSize exceeds!

Another day, and another strange message from PQ online.  I was trying to refresh a PBI online dataflow, and received the following:

 

PipelineException: With compression algorithm the compressed data size in a packet exceeds the max ServiceBus limit: GatewayCompressor - CompressedDataSize (25906718) of a non-compressed packet exceeds the maximum payload size of 8500000 .

 

I'm not sure how to resolve this exactly.  I was pulling a very large chunk of JSON and trying to encode it in a dataflow table for subsequent use in another table (a "computed entity").  However, it appears that PQ is not willing or able to help me out.

Does anyone have any idea how to send a very large chunk of JSON out into the storage account where my dataflows live?
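
For reference, the kind of workaround I've been sketching is to split the JSON text into smaller chunks, one per row, so that no single value comes anywhere near the payload limit. This is only a rough sketch with placeholder names (the URL and chunk size are made up), not something I've confirmed gets past the gateway limit:

let
    // Placeholder source: however the large JSON text is actually obtained
    SourceJsonText = Text.FromBinary(Web.Contents("https://example.com/big.json"), TextEncoding.Utf8),
    // ~1,000,000 characters per row (assumption; tune so each row stays well under the limit)
    ChunkSize = 1000000,
    Chunks = List.Generate(
        () => 0,
        (offset) => offset < Text.Length(SourceJsonText),
        (offset) => offset + ChunkSize,
        (offset) => Text.Range(SourceJsonText, offset, List.Min({ChunkSize, Text.Length(SourceJsonText) - offset}))
    ),
    ChunkTable = Table.FromList(Chunks, Splitter.SplitByNothing(), {"JsonChunk"}),
    // Index column so the consuming query can reassemble the chunks in order
    Indexed = Table.AddIndexColumn(ChunkTable, "ChunkIndex", 0, 1)
in
    Indexed

The computed entity would then sort by ChunkIndex, reassemble with Text.Combine, and call Json.Document on the result.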

 

I wish this error message were more meaningful.  Most people shouldn't have to know or care about "ServiceBus", nor should they have to worry about bumping into some random/arbitrary 8.5 MB payload maximum.

 

Any help would be appreciated.  This error doesn't come up in any Google searches related to Power BI.

5 REPLIES
dbeavon3
Continued Contributor

While trying to fix this, I got another error with a similar format.

 

This time it's complaining about uncompressed sizes.

 

PipelineException: The uncompressed data size specified in a packet header exceeds max limit: GatewayDecompressor - Header.UncompressedDataSize (193364724) of a compressed packet exceeds the maximum allowed uncompressed payload of 157286421 .
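
For anyone trying to reproduce this: a quick way to see how close a text value is to the uncompressed limit in that message (157286421 bytes, roughly 150 MB) is to measure its UTF-8 byte count. A minimal helper, assuming the text going into the entity is roughly what ends up in the packet:

let
    // Hypothetical helper (not part of the error or any official API): returns the
    // UTF-8 byte count of a text value for comparison against the 157286421-byte limit
    MeasureUtf8Bytes = (jsonText as text) as number =>
        Binary.Length(Text.ToBinary(jsonText, TextEncoding.Utf8))
in
    MeasureUtf8Bytes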

 

 

aj1973
Community Champion

Gateways and APIs have payload size limits; AWS, for instance:

[screenshots: examples of AWS payload size limits]

 

I think moving to Premium capacity could help you out.

Regards
Amine Jerbi

If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook

dbeavon3
Continued Contributor

We are already on Premium.

 

I really wish dataflows were more user-friendly.  There should be a way to serialize arbitrary objects out to that storage, for the sake of related entities ("computed entities").

 

These CSV-compatibility requirements are just plain obnoxious.  There should be a way to save XML, JSON, plain text, and lots more.  Even binary formats like Parquet would be very helpful!
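
In the meantime, the only CSV-compatible workaround I can think of is Base64-encoding binary content into a text column and decoding it on the consuming side. A rough sketch (the payload here is just a stand-in, and I haven't verified how this interacts with the size limits above):

let
    // Stand-in binary payload; in practice this would be the XML/JSON/Parquet content
    SomeBinary = Text.ToBinary("example payload", TextEncoding.Utf8),
    // Base64 keeps the value CSV-compatible as a single text column
    Encoded = Binary.ToText(SomeBinary, BinaryEncoding.Base64),
    AsTable = #table({"Payload"}, {{Encoded}}),
    // The consuming query would reverse it like this:
    Decoded = Binary.FromText(AsTable{0}[Payload], BinaryEncoding.Base64)
in
    [Stored = AsTable, RoundTripped = Text.FromBinary(Decoded, TextEncoding.Utf8)]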

aj1973
Community Champion

You can submit your idea/issue here:

https://community.powerbi.com/t5/Issues/idb-p/Issues

Who knows, they may help you out.

Regards
Amine Jerbi


dbeavon3
Continued Contributor

Thanks for the tip @aj1973 

Who knows?  I know.  I'm still waiting for some "public previews" that have been a work in progress for a couple of years.  I will wait for the more important items to be fixed before adding to the backlog.
