from delta.tables import DeltaTable
from pyspark.sql.types import IntegerType

delta = (
    DeltaTable.createIfNotExists(spark)
    .tableName("lakehouse_testing.t1")
    .addColumn("id", IntegerType())
    .execute()
)
## The above fails, but I see the following method doc:
(method) def tableName(identifier: str) -> DeltaTableBuilder
Specify the table name. Optionally qualified with a database name [database_name.] table_name.
:param identifier: the table name
:type identifier: str
:return: this builder
# Error - Spark SQL queries are only possible in the context of a lakehouse. Please attach a lakehouse to proceed.
Can I build a DeltaTable without attaching a lakehouse at all?
Hi @smpa01 ,
Thanks for reaching out to the Microsoft Fabric community forum.
In addition to @BhaveshPatel's answer:
In Microsoft Fabric, not quite. The DeltaTableBuilder API requires an attached Lakehouse context to function; without one, you'll hit errors like the one you saw: "Spark SQL queries are only possible in the context of a lakehouse. Please attach a lakehouse to proceed."
Workaround: Use File Paths Instead
Whether you're in Fabric or an external engine such as Databricks or Synapse, you can bypass the Lakehouse attachment by writing directly to a file path:
df.write \
.format("delta") \
.mode("overwrite") \
.save("abfss://<workspace_id>@onelake.dfs.fabric.microsoft.com/<lakehouse_id>/Tables/<table_name>")
This approach uses OneLake ABFS paths and doesn’t require the Lakehouse to be attached in the notebook UI.
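To address the original question more directly: once the Delta files exist at a path, you may be able to get a DeltaTable handle the same way, via the path-based API instead of a table name. A minimal sketch, assuming path-based access avoids the metastore lookup that tableName triggers (the placeholder path is the same OneLake pattern as above, not a real location):

from delta.tables import DeltaTable

# Path-based handle; no table-name resolution, so no attached Lakehouse should be needed
path = "abfss://<workspace_id>@onelake.dfs.fabric.microsoft.com/<lakehouse_id>/Tables/<table_name>"
dt = DeltaTable.forPath(spark, path)
dt.toDF().show()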
Alternative: Use CREATE EXTERNAL TABLE or %sql Magic
You can also define Delta tables using SQL magic or external table syntax:
%sql
CREATE TABLE my_table (
id INT
) USING DELTA
LOCATION 'abfss://...'
Note that this approach works only when a Lakehouse is attached.
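The %sql magic is just a convenience wrapper around spark.sql, so the same statement can be run from Python as well; a sketch, with the same Lakehouse caveat and the elided LOCATION left as a placeholder:

spark.sql("""
    CREATE TABLE my_table (
        id INT
    ) USING DELTA
    LOCATION 'abfss://...'
""")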
If you are still facing any challenges, we would be happy to assist you further.
We appreciate your engagement and thank you for being an active member of the community.
Best regards,
LakshmiNarayana
Thanks for the response. Just for context: there are several ways to create a table, and the DeltaTableBuilder API is one of them. I am using it because the Spark DataFrameWriter can't quite do what the DeltaTableBuilder API can.
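For readers wondering what the builder offers over DataFrameWriter, here is a minimal sketch (the extra column, comments, and property are illustrative, not from the original post): the builder can declare an empty table with an explicit schema, per-column comments, nullability, and table properties, which saveAsTable doesn't express directly.

from delta.tables import DeltaTable
from pyspark.sql.types import IntegerType, StringType

(
    DeltaTable.createIfNotExists(spark)
    .tableName("lakehouse_testing.t1")  # still needs an attached Lakehouse in Fabric
    .addColumn("id", IntegerType(), comment="surrogate key")  # hypothetical column comment
    .addColumn("name", StringType(), nullable=False)          # hypothetical second column
    .property("delta.appendOnly", "true")                     # hypothetical table property
    .execute()
)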
You are missing two datasets: the Spark DataFrame (data lake) and the Delta table (lakehouse).

# Data lake first: convert the source DataFrame to a Spark DataFrame
sdf = spark.createDataFrame(df)

# Data lakehouse: write the Spark DataFrame to Delta format as a managed table
sdf.write.format("delta").mode("overwrite").saveAsTable("DimTable")
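One caveat worth flagging, since it bears on the original question: saveAsTable, like DeltaTableBuilder.tableName, resolves the table name through the metastore, so in a Fabric notebook it should hit the same attach-a-lakehouse error; only the path-based save("abfss://...") approach shown earlier avoids that requirement.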