Is there any way to get a .csv onto my computer from a delta table in a lakehouse? I haven't found any viable approach so far despite trying.
One way to achieve this is to write a small Python script in a notebook that reads the delta table from your lakehouse and exports it to the Files section of the lakehouse. It may be convenient to build a pandas DataFrame after you've read the delta table with Spark and use pandas' to_csv() functionality.
With the OneLake file explorer (https://www.microsoft.com/en-us/download/details.aspx?id=105222), you can then view and copy that CSV to a local folder on your computer.
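For example, a minimal notebook sketch along those lines (the table name "my_table" and the output file name are placeholders, and the /lakehouse/default/... mount path assumes a default lakehouse is attached to the notebook, where the spark session is predefined):

```python
# Minimal sketch for a Fabric notebook with a default lakehouse attached.
# "my_table" and the output file name are placeholders.

# Read the delta table from the Tables section with Spark.
df = spark.read.table("my_table")

# Pull it into pandas (fine for small/medium tables that fit in driver memory).
pdf = df.toPandas()

# Write a single CSV into the lakehouse Files section via the local mount path.
pdf.to_csv("/lakehouse/default/Files/my_table_export.csv", index=False)
```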
Brilliant, it works. FYI, if I use
df.write.csv("Files/my_csv.csv")
to write the file, Spark produces a folder rather than a single file, i.e. it makes a folder named "my_csv.csv" containing the part files. However, the file(s) inside are CSV, so for my purposes it just works.
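As a side note, a common way to get a single part file out of Spark's folder output is to coalesce to one partition before writing; a sketch (standard PySpark behaviour, not specific to delta):

```python
# Spark always writes a directory; coalescing to one partition first
# leaves a single part-*.csv file inside that directory.
(df.coalesce(1)
   .write
   .mode("overwrite")
   .option("header", True)
   .csv("Files/my_csv"))
```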
If you use Pandas to_csv(), instead of PySpark write.csv(), you will get a single CSV file.
https://learn.microsoft.com/en-us/fabric/data-science/read-write-pandas
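If you want to skip Spark entirely, one possible variant (assuming the deltalake / delta-rs package is available in the notebook, or installed with %pip install deltalake) reads only the current version of the table straight into pandas:

```python
from deltalake import DeltaTable

# Hypothetical pandas-only sketch: read the current version of the delta table
# from the attached lakehouse's Tables mount and write a single CSV to Files.
dt = DeltaTable("/lakehouse/default/Tables/my_table")
pdf = dt.to_pandas()
pdf.to_csv("/lakehouse/default/Files/my_table_export.csv", index=False)
```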
Pandas cannot handle data volumes as large as Spark can, since it holds everything in memory on a single node.
https://www.reddit.com/r/MicrosoftFabric/s/7EloDpd8wI
If your data volume is not too large, then I guess I would use pandas to write a single CSV.
But seriously, converting a Delta table back to CSV is a travesty. Can your application not work with Parquet?
My goals are beyond your understanding.
The Delta tables are already in Parquet format. It would be counterproductive to convert that to CSV.
"the entire table" - careful with that. Parquet supports versioning so you risk exporting all the prior versions too.
There is only one version.