Hi @Ka13,
You’re getting this error because the deltalake (delta‑rs) library does not support name‑based ABFSS paths in Fabric. It requires the Workspace ID and Lakehouse Item ID, not the workspace or lakehouse name.
You can get both IDs from the lakehouse URL in your browser, e.g.
https://app.fabric.microsoft.com/groups/<workspace-id>/lakehouses/<lakehouse-id>?sparkUpgradeToFabric=1&experience=fabric-developer
The correct ABFSS format is:
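abfss://<workspace-id>@onelake.dfs.fabric.microsoft.com/<lakehouse-id>/Tables/<table-name>
(The two GUIDs are the ones from the URL above; the trailing segment is whatever you want to call the Delta table.)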
@Asmita_27 - thanks for your reply. How do I get the workspace-id and lakehouse-id in a Fabric Python notebook, so I can pass them into the ABFSS URL?
Hello @Ka13, I doubt you'd be able to import mssparkutils, as it is a Spark utility and you're using a Python notebook.
Here's some code you can use in a Python notebook instead:
import sempy.fabric as fabric

def get_workspace_and_lakehouse_id(target_name="my_lakehouse"):
    # ID of the workspace this notebook is running in
    workspace_id = fabric.get_notebook_workspace_id()

    # List the Lakehouse items in that workspace (returns a DataFrame)
    items = fabric.list_items(type="Lakehouse", workspace=workspace_id)

    lakehouse_id = None
    # Each row holds the item id in column 0 and the display name in column 1
    for it in items.values:
        if it[1] == target_name:
            lakehouse_id = it[0]
            break

    return workspace_id, lakehouse_id

print(get_workspace_and_lakehouse_id())
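Once you have the IDs, a rough sketch of wiring them into a delta-rs write could look like this (the "customers" table name and sample DataFrame are placeholders, and the bearer-token storage_options pattern is the one commonly used with delta-rs against OneLake; adjust to your environment):

import pandas as pd
import notebookutils
from deltalake import write_deltalake

workspace_id, lakehouse_id = get_workspace_and_lakehouse_id("my_lakehouse")

# GUID-based OneLake path that delta-rs expects ("customers" is a placeholder table name)
table_uri = f"abfss://{workspace_id}@onelake.dfs.fabric.microsoft.com/{lakehouse_id}/Tables/customers"

# delta-rs needs an explicit token when writing to OneLake from a Python notebook
token = notebookutils.credentials.getToken("storage")

df = pd.DataFrame({"id": [1, 2], "name": ["a", "b"]})
write_deltalake(
    table_uri,
    df,
    mode="overwrite",
    storage_options={"bearer_token": token, "use_fabric_endpoint": "true"},
)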
Hi,
from notebookutils import mssparkutils
Hello @Ka13,
If you don't need to write in Delta format, you can simply write your pandas dataset using the "File API Path". First add the lakehouse as a data item to your Python notebook. Then, in the /Files section of the lakehouse, click the three dots on the folder you want to write to and copy the "File API Path" from the context menu. You can use that path directly with the pandas to_csv method to write the dataset into the folder:
df.to_csv('/lakehouse/default/Files/pandas_data/customers.csv', index=False, encoding="utf-8")
For writing in Delta format, it's best to use a Spark notebook and the ABFSS path, since Delta is Spark's native table format.
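A minimal Spark-notebook sketch of that approach (the DataFrame, column names, and path placeholders here are just examples):

import pandas as pd

# Sample pandas DataFrame to persist as a Delta table
pdf = pd.DataFrame({"customer_id": [1, 2], "name": ["a", "b"]})

# Convert to a Spark DataFrame and write it in Delta format via the ABFSS path
sdf = spark.createDataFrame(pdf)
sdf.write.format("delta").mode("overwrite").save(
    "abfss://<workspace-id>@onelake.dfs.fabric.microsoft.com/<lakehouse-id>/Tables/customers"
)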