Anonymous
Not applicable

Export entire delta table from lakehouse

Is there any way to get a .csv onto my computer from a delta table in a lakehouse? I haven't found any viable approach so far despite trying.

1 ACCEPTED SOLUTION
FabianSchut
Super User

One way to achieve this is to write a small Python script in a notebook that reads the delta table from your lakehouse and exports it to the Files section of your lakehouse. It may be convenient to convert the result to a pandas DataFrame after reading the delta table with Spark, and then use pandas' 'to_csv' functionality.
With the OneLake file explorer (https://www.microsoft.com/en-us/download/details.aspx?id=105222), you can then view and copy this CSV to a local directory on your computer.

View solution in original post

7 REPLIES

Anonymous
Not applicable

Brilliant, it works. FYI, if I use

df.write.csv("Files/my_csv.csv")

to write the file, Spark keeps its partitioned output layout, i.e. it creates a folder named "my_csv.csv" containing part files. However, those part files are CSV, so for my purposes it just works.

If you use pandas' to_csv() instead of PySpark's write.csv(), you will get a single CSV file.

https://learn.microsoft.com/en-us/fabric/data-science/read-write-pandas

Note that pandas cannot handle data volumes as large as Spark can:

https://www.reddit.com/r/MicrosoftFabric/s/7EloDpd8wI

If your data volume is not too large, I would use pandas to write a single CSV.

But seriously, converting a Delta file back to CSV is a travesty. Can your application not work with Parquet?

Anonymous
Not applicable

My goals are beyond your understanding. 

¯\_(ツ)_/¯

lbendlin
Super User

The Delta tables are already stored in Parquet format, so it would be counterproductive to convert that to CSV.

"the entire table" - careful with that. Delta supports versioning (time travel), and the underlying folder keeps files from prior versions until they are vacuumed, so if you copy the raw files directly you risk exporting all the prior versions too. Reading the table through Spark returns only the latest snapshot.

Anonymous
Not applicable

There is only one version.
