Hello @Ka13
If you don't need to write in Delta format, you can simply write your pandas DataFrame using the File API Path. First, add the lakehouse as a data item in your Python notebook. Then, in the lakehouse's Files section, click the three dots next to the folder you want to write to and copy the File API Path from the context menu. You can pass that path directly to the pandas to_csv method to write the dataset into the folder.
df.to_csv('/lakehouse/default/Files/pandas_data/customers.csv', index=False, encoding="utf-8")
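As a minimal end-to-end sketch of that round trip (using a local temporary directory as a stand-in for the `/lakehouse/default/Files` mount, which only exists inside a Fabric notebook):

```python
import os
import tempfile

import pandas as pd

# Stand-in for the lakehouse File API mount (/lakehouse/default/Files/...),
# which is only available inside a Fabric Python notebook.
base_dir = tempfile.mkdtemp()
target = os.path.join(base_dir, "pandas_data")
os.makedirs(target, exist_ok=True)

df = pd.DataFrame({"customer_id": [1, 2], "name": ["Alice", "Bob"]})

# Write the DataFrame as CSV, exactly as you would against the File API Path.
csv_path = os.path.join(target, "customers.csv")
df.to_csv(csv_path, index=False, encoding="utf-8")

# Read it back to confirm the file landed where expected.
check = pd.read_csv(csv_path)
print(check.shape)  # (2, 2)
```

Inside a Fabric notebook you would simply swap `csv_path` for the File API Path you copied from the lakehouse.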
For writing in Delta format, it's best to use a Spark notebook and the ABFS file path, because Delta is a Spark-native storage format.
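A sketch of what that Spark notebook cell might look like (the workspace, lakehouse, and table names below are placeholders you'd replace with your own; in a Fabric Spark notebook the `spark` session already exists, so this only runs inside that environment):

```python
# In a Fabric Spark notebook, `spark` is predefined; no SparkSession setup needed.
# The paths and names below are illustrative placeholders.
df_spark = spark.createDataFrame(
    [(1, "Alice"), (2, "Bob")],
    ["customer_id", "name"],
)

# Write the DataFrame as a Delta table into the lakehouse Tables area
# via the OneLake ABFS path.
abfs_path = (
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/"
    "<lakehouse>.Lakehouse/Tables/customers"
)
df_spark.write.format("delta").mode("overwrite").save(abfs_path)
```

If the lakehouse is attached as the notebook's default, `df_spark.write.format("delta").saveAsTable("customers")` achieves the same result without spelling out the ABFS path.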