Hi all,
I've been using the Unzip functions from @artemis and @lbendlin, and they have worked fine for files smaller than 4 GB.
However, I now have to source some very large zip files on Azure Data Lake, one of which is 6.5 GB zipped and 52 GB unzipped, and many more are over 4 GB zipped. Unfortunately these files fail with one of these two messages:
Standard Unzip script:
An error occurred in the ‘’ query. Expression.Error: The number is out of range of a 32 bit integer value.
Details:
6550844488
Unzip script based on Binary.Buffer:
An error occurred in the ‘’ query. DataFormat.Error: The number of items in the list is too large. Buffered lists can support up to 2147483647 items and streamed lists can support up to 4503599627370496 items.
Details:
2147483648
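The numbers in those two error messages line up exactly with the signed 32-bit integer ceiling and the buffered-list ceiling; a quick check (Python used here purely for the arithmetic):

```python
# Ceilings implied by the two error messages.
INT32_MAX = 2**31 - 1      # 2147483647: max signed 32-bit integer
STREAMED_MAX = 2**52       # 4503599627370496: streamed-list limit from the error

zipped_size = 6550844488   # byte count reported by the first error (~6.5 GB)

print(zipped_size > INT32_MAX)        # file offsets no longer fit in 32 bits
print(INT32_MAX + 1 == 2147483648)    # the exact item count in the second error
print(STREAMED_MAX == 4503599627370496)
```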
Is there any unzip solution for very large (4 GB+) Zip64 files? Or a way to alter the existing scripts to make them support these massive files?
No idea what streamed lists are, but they seem to be your only hope. This would require the decompression algorithm to be able to work on a stream rather than a blob.
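Stream-oriented decompression means never holding the whole blob in memory: feed the decompressor fixed-size chunks and emit output as it becomes available. Outside Power Query, the idea looks like this (Python's `zlib` shown purely as an illustration; the M scripts would need an equivalent chunked approach):

```python
import zlib

def inflate_stream(chunks, wbits=-15):
    """Decompress raw-deflate data chunk by chunk.

    ZIP entries store raw deflate streams, hence the negative wbits.
    Memory use stays bounded by the chunk size, not by the total
    uncompressed size, which is the property a 52 GB file needs.
    """
    d = zlib.decompressobj(wbits)
    for chunk in chunks:
        out = d.decompress(chunk)
        if out:
            yield out
    tail = d.flush()
    if tail:
        yield tail
```

The key design point is that `decompressobj` keeps its own sliding-window state between calls, so the caller can read the compressed source in arbitrarily small pieces.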
Thanks @lbendlin, I've not used streaming before, so this is a new challenge.
Any pointers from anyone who has faced this challenge are welcome. I only need to crunch the metadata from the massive CSVs inside the ZIP files, so streaming them in and being able to run a Table.Profile at minimum is my goal.
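Since only the metadata matters, one workable pattern is to scan the CSV member incrementally and accumulate per-column statistics as rows stream past. Sketched here in Python (`ZipFile.open` returns a streaming file object, so the 52 GB CSV is never fully materialized; the rough min/max/row-count profile is a minimal stand-in for Table.Profile, not its implementation):

```python
import csv
import io
import zipfile

def profile_csv_member(zip_path, member):
    """Stream one CSV out of a zip and collect rough per-column metadata:
    total row count plus the lexicographic min and max of each column.
    Nothing but the current row and the stats dict is held in memory."""
    stats = {}
    rows = 0
    with zipfile.ZipFile(zip_path) as zf:
        with zf.open(member) as raw:  # streamed, not buffered
            reader = csv.DictReader(io.TextIOWrapper(raw, encoding="utf-8"))
            for row in reader:
                rows += 1
                for col, val in row.items():
                    s = stats.setdefault(col, {"min": val, "max": val})
                    s["min"] = min(s["min"], val)
                    s["max"] = max(s["max"], val)
    return rows, stats
```

The same shape extends to distinct counts or null counts; the point is that every statistic is updated one row at a time.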
If you only need the metadata, then you can check where the directory sits in your ZIP files (beginning or end) and read only that part. My article explains where to look and how to do it.
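In the ZIP format, the central directory sits at the end of the archive, pointed to by the End of Central Directory (EOCD) record in the file's tail. The sketch below (Python for illustration, not the article's M script; it assumes a plain archive, since a Zip64 file stores `0xFFFFFFFF` sentinels here and needs the separate Zip64 EOCD record) shows what "read only that part" means: locate the EOCD, then parse just the directory for names and sizes without touching the file bodies.

```python
import struct

EOCD_SIG = b"PK\x05\x06"  # End of Central Directory signature
CDH_SIG = b"PK\x01\x02"   # Central directory file header signature

def list_entries(f):
    """Return [(name, compressed_size, uncompressed_size)] by reading only
    the archive tail and the central directory, never the member data."""
    f.seek(0, 2)
    size = f.tell()
    tail = min(size, 65536 + 22)  # EOCD is 22 bytes + max comment length
    f.seek(size - tail)
    buf = f.read()
    pos = buf.rfind(EOCD_SIG)
    if pos < 0:
        raise ValueError("EOCD record not found")
    # sig, disk, cd_disk, entries_on_disk, total_entries, cd_size, cd_offset, comment_len
    _, _, _, _, count, cd_size, cd_off, _ = struct.unpack_from("<4s4HLLH", buf, pos)
    f.seek(cd_off)
    cd = f.read(cd_size)
    entries, i = [], 0
    for _ in range(count):
        (sig, _, _, _, _, _, _, _, csize, usize,
         nlen, xlen, clen, _, _, _, _) = struct.unpack_from("<4s6HLLLHHHHHLL", cd, i)
        if sig != CDH_SIG:
            raise ValueError("corrupt central directory")
        name = cd[i + 46:i + 46 + nlen].decode("utf-8", "replace")
        entries.append((name, csize, usize))
        i += 46 + nlen + xlen + clen  # skip name, extra field, comment
    return entries
```

For a 6.5 GB archive this reads a few tens of kilobytes instead of the whole file, which is exactly the trick when only the directory metadata is needed.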