arkiboys
Advocate II

save data to .csv

 
hello,

in a Fabric dataflow, after some transformations, I have the data.

Now I would like to save this data as a CSV file.

How do I do this in a notebook?

thanks

1 ACCEPTED SOLUTION
v-nikhilan-msft
Community Support

Hi again @arkiboys 
Thanks for using Fabric Community.
Unfortunately, Fabric Dataflow Gen2 itself doesn't have built-in functionality to directly save the transformed data as a CSV file. However, you can achieve this by integrating the Dataflow with Notebooks:

1) Set the destination to a lakehouse in Dataflow Gen2 and publish the Dataflow. This will create a new table in your lakehouse.

2) Create a new notebook and run the code below:

df = spark.sql("SELECT * FROM lakehouse_1.customers_1000")
# Note: Spark writes "Files/customersnew.csv" as a folder containing part files
df.write.option("header", True).csv("Files/customersnew.csv")
3) The output will appear in the lakehouse Files section.
Hope this helps. Please let me know if you have any further questions.


6 REPLIES
NandanHegde
Super User

Adding to what @v-nikhilan-msft stated, you can also use a data pipeline to copy the data from the lakehouse and save it as CSV to any sink supported by data pipelines, if need be.

Is there any specific reason why you want to integrate a notebook?




----------------------------------------------------------------------------------------------
Nandan Hegde (MSFT Data MVP)
LinkedIn Profile : www.linkedin.com/in/nandan-hegde-4a195a66
GitHub Profile : https://github.com/NandanHegde15
Twitter Profile : @nandan_hegde15
MSFT MVP Profile : https://mvp.microsoft.com/en-US/MVP/profile/8977819f-95fb-ed11-8f6d-000d3a560942
Topmate : https://topmate.io/nandan_hegde
Blog : https://datasharkx.wordpress.com

Please do let me know how to save the data as .csv in the pipeline.

Thank you

You can use a Copy activity for this after the dataflow trigger.


The destination can be any sink supported by data pipelines. In my example I have used a lakehouse.




v-nikhilan-msft
Community Support


If I understand correctly, this approach will create a folder with multiple files.

 

To create a single file from a Notebook, please refer to these threads:

 

https://community.fabric.microsoft.com/t5/Data-Engineering/How-do-I-just-write-a-CSV-file-to-a-lakeh...

 

https://community.fabric.microsoft.com/t5/Data-Engineering/Progrmatically-write-files-in-delta/m-p/4...

 

I also think @NandanHegde's solution is a good option that doesn't involve a Notebook.
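As a rough sketch of the single-file pattern those linked threads describe (assuming a Fabric notebook where `spark` and `notebookutils` are available; the helper names `pick_part_file` and `write_single_csv` are hypothetical, not a Fabric API): coalesce to one partition, then rename the single part file Spark produces.

```python
def pick_part_file(names):
    """Return the Spark part file (part-*.csv) from a directory listing."""
    return next(n for n in names if n.startswith("part-") and n.endswith(".csv"))

def write_single_csv(df, tmp_dir, final_path):
    """Write df as exactly one CSV file at final_path (Fabric notebook sketch)."""
    # Coalesce to a single partition so Spark emits exactly one part file.
    df.coalesce(1).write.mode("overwrite").option("header", True).csv(tmp_dir)
    # Promote the single part file to a stable name, then drop the temp folder.
    part = pick_part_file([f.name for f in notebookutils.fs.ls(tmp_dir)])
    notebookutils.fs.mv(f"{tmp_dir}/{part}", final_path)
    notebookutils.fs.rm(tmp_dir, True)

# Usage in a Fabric notebook (hypothetical paths):
# write_single_csv(spark.sql("SELECT * FROM lakehouse_1.customers_1000"),
#                  "Files/customersnew_tmp", "Files/customersnew.csv")
```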

By any chance, do you know if the Notebook code can be updated to dynamically append the field "File_Instance" to the output name?
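On the dynamic-naming question, which the thread leaves open: one possible sketch, assuming the instance value is available as a Python variable in the notebook (`build_output_path` is a hypothetical helper, not a Fabric API), is simply to build the output path before writing.

```python
from datetime import datetime, timezone

def build_output_path(base_dir, stem, file_instance=None):
    """Build a CSV output path, appending an instance tag (or a UTC timestamp)."""
    if file_instance is not None:
        suffix = file_instance
    else:
        suffix = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    return f"{base_dir}/{stem}_{suffix}.csv"

# e.g. in the notebook:
# path = build_output_path("Files", "customersnew", file_instance="042")
# df.write.option("header", True).csv(path)
```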
