VenDaFabricator
Resolver I

Creating zip file in notebook

I have a bunch of Delta tables that I need to save as .json files and zip up, so I can supply them to a third-party tool for further processing.

 

I was able to export all the Delta tables to .json files, but I'm getting an error when zipping them.

My .json files are in the folder 'Files/temp/consolidated/':

 

 

VenDaFabricator_0-1729999608512.png

Now I want to zip them all into 'sample.zip'. Here is my code. I'm not sure what I'm doing wrong: zipf.write can't find json_file_path, even though the path looks right. Any clues, or is there a different way to do it?

 

 

import zipfile

# filenames: list of .json file names in Files/temp/consolidated/
with zipfile.ZipFile('sample.zip', 'w') as zipf:
    for f in filenames:
        json_file_path = f"Files/temp/consolidated/{f}"
        print(json_file_path)
        zipf.write(json_file_path, arcname=f)

 

 

 

Microsoft Fabric Context: The Fabric environment uses a virtual file system, which doesn’t provide direct file path access as in traditional file systems.

zipfile.ZipFile.write Limitation: This method expects an OS-level file path, which a Fabric-relative path like 'Files/...' is not.
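Given those two constraints, a quick way to surface the failure mode is to check that each path is actually visible to the OS before calling zipf.write. A minimal sketch (the helper name and the explicit check are my addition, not from the thread):

```python
import os
import zipfile

def zip_json_files(folder, filenames, out_path):
    """Zip the given files, failing fast when a path is not OS-visible."""
    with zipfile.ZipFile(out_path, "w") as zipf:
        for name in filenames:
            src = os.path.join(folder, name)
            # A lakehouse-relative path like "Files/temp/consolidated/x.json"
            # is not an OS path inside a Fabric notebook, so this check fails
            # there unless the data is reachable via a local or mounted path.
            if not os.path.exists(src):
                raise FileNotFoundError(f"Not an OS-visible path: {src}")
            zipf.write(src, arcname=name)
```

Run against a genuinely local folder this works; run against the Fabric-relative path it raises immediately, which pinpoints the problem.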

 

 

7 Replies
richbenmintz
Resident Rockstar

Why not use a Copy activity in a pipeline? You can iterate over all the tables you want.

Delta table as your source.

OneLake file as your destination, with JSON as the file type; set the compression type to .zip.

richbenmintz_0-1730744049588.png

 



I hope this helps,
Richard

Did I answer your question? Mark my post as a solution! Kudos Appreciated!

Proud to be a Super User!


Anonymous
Not applicable

Hi @VenDaFabricator 

 

Have you resolved this issue? If any of the answers provided were helpful, please consider accepting them as a solution. If you have found other solutions, we would greatly appreciate it if you could share them with us. Thank you!

 

Best Regards,
Jing

Anonymous
Not applicable

Hi @VenDaFabricator 

 

Maybe you can mount the lakehouse through the Microsoft Spark Utilities package? Mounting allows you to use local file APIs to access data under the mount point as if it were stored in the local file system.

How to mount a lakehouse 

Access files under the mount point via local path 

 

I haven't used it to create zip files, but you can give it a try.
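If the mount works, the zip step itself is plain stdlib zipfile against the mount's local path. A sketch under that assumption — the mount point, abfss URL, and folder below are placeholders, and the mssparkutils calls are shown as comments since they only exist inside a Fabric notebook:

```python
import os
import zipfile

def zip_folder(local_folder, out_zip):
    """Zip every .json file found in an OS-visible folder."""
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zipf:
        for name in sorted(os.listdir(local_folder)):
            if name.endswith(".json"):
                zipf.write(os.path.join(local_folder, name), arcname=name)

# In a Fabric notebook (untested sketch; <workspace>/<lakehouse> are placeholders):
# mssparkutils.fs.mount(
#     "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse",
#     "/mnt/lh",
# )
# local = mssparkutils.fs.getMountPath("/mnt/lh") + "/Files/temp/consolidated"
# zip_folder(local, "/tmp/sample.zip")
```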

 

Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!

lbendlin
Super User

not sure what am doing wrong

It's pretty clear that what you are trying to do is not supported. Don't use zipFile - use its in-memory equivalent, whatever that is.

Thanks lbendlin for the reply. Do you have any sample code or links for doing this zipping in memory?
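For reference, the standard library can build the archive entirely in memory with io.BytesIO and ZipFile.writestr. A minimal sketch — how you read the JSON content and where you write the resulting zip bytes depends on your environment, and is not shown here:

```python
import io
import zipfile

def zip_in_memory(files):
    """Build a zip archive in memory from {name: bytes} and return its bytes."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zipf:
        for name, data in files.items():
            # writestr takes the archive member name and its content directly,
            # so no OS-visible source path is ever needed.
            zipf.writestr(name, data)
    return buf.getvalue()

# The returned bytes never touch the local disk; write them back wherever
# your environment allows (e.g. via a mounted lakehouse path).
```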

 

 

Thanks @lbendlin 👍 ... will give it a try and see how it goes. 👍
