
smpa01
Super User

DeltaTableBuilder without Lakehouse Context

from delta.tables import DeltaTable
from pyspark.sql.types import IntegerType

delta = (DeltaTable.createIfNotExists(spark)
         .tableName("lakehouse_testing.t1")
         .addColumn("id", IntegerType())
         .execute())

## The above fails, but I see the following method doc:
(method) def tableName(identifier: str) -> DeltaTableBuilder
Specify the table name. Optionally qualified with a database name [database_name.] table_name.

:param identifier: the table name
:type identifier: str
:return: this builder

# Error - Spark SQL queries are only possible in the context of a lakehouse. Please attach a lakehouse to proceed.

 

Can I build a DeltaTable without attaching a lakehouse at all?

1 ACCEPTED SOLUTION
v-lgarikapat
Community Support

Hi @smpa01,

Thanks for reaching out to the Microsoft Fabric community forum.

@BhaveshPatel, thanks for your prompt response. In addition to @BhaveshPatel's reply:

In Microsoft Fabric, not quite. The DeltaTableBuilder API requires an attached Lakehouse context to function properly. Without one, you'll hit errors like:

  • Spark SQL queries are only possible in the context of a lakehouse
  • SCHEMA_NOT_FOUND or IllegalArgumentException when resolving table paths

Workaround: Use File Paths Instead
Whether you're in Fabric or elsewhere (e.g., Databricks or Synapse), you can bypass the Lakehouse attachment by writing directly to a file path:

df.write \
    .format("delta") \
    .mode("overwrite") \
    .save("abfss://<workspace_id>@onelake.dfs.fabric.microsoft.com/<lakehouse_id>/Tables/<table_name>")

 

This approach uses OneLake ABFS paths and doesn’t require the Lakehouse to be attached in the notebook UI.
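
It may also be worth pointing the builder itself at a path rather than a catalog name: delta-spark's DeltaTableBuilder exposes a .location() method, and a path-based table doesn't need the metastore lookup that .tableName() does. A minimal sketch, assuming the standard delta-spark builder API and a placeholder OneLake path (I haven't verified that this avoids the Lakehouse-context check in Fabric):

# A sketch, not verified in Fabric: create a path-based Delta table with
# .location() instead of .tableName(). The abfss path below is a placeholder.
from delta.tables import DeltaTable
from pyspark.sql.types import IntegerType

(DeltaTable.createIfNotExists(spark)
    .location("abfss://<workspace_id>@onelake.dfs.fabric.microsoft.com/<lakehouse_id>/Tables/<table_name>")
    .addColumn("id", IntegerType())
    .execute())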

 

Alternative: Use CREATE EXTERNAL TABLE or SQL Magic
You can also define Delta tables using SQL magic or external table syntax:

%%sql
CREATE TABLE my_table (
  id INT
) USING DELTA
LOCATION 'abfss://...'

 

This works only when a Lakehouse is attached.
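
For reference, the same definition can be issued from PySpark via spark.sql() rather than cell magic; this is just the equivalent call (the truncated abfss path stays a placeholder, and the same attachment requirement applies):

# Equivalent spark.sql() call; requires the same attached-Lakehouse context.
spark.sql("""
    CREATE TABLE IF NOT EXISTS my_table (id INT)
    USING DELTA
    LOCATION 'abfss://...'
""")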


If you are still facing any challenges, we would be happy to assist you further.

We appreciate your engagement and thank you for being an active member of the community.

Best regards,
LakshmiNarayana

 


3 REPLIES
smpa01
Super User

Thanks for the response. Just for context: there are several ways to create a table, and the DeltaTableBuilder API is one of them. I am using it because the Spark DataFrameWriter can't quite do what the DeltaTableBuilder API can.
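
To illustrate (a hypothetical sketch, not my actual schema), these are the kinds of things the builder can declare that the DataFrameWriter cannot, such as column comments, generated columns, and table properties:

# Hypothetical example of builder-only features (the schema is made up):
from delta.tables import DeltaTable
from pyspark.sql.types import IntegerType, TimestampType, DateType

(DeltaTable.createIfNotExists(spark)
    .tableName("lakehouse_testing.t1")
    .addColumn("id", IntegerType(), comment="surrogate key")
    .addColumn("ts", TimestampType())
    .addColumn("ts_date", DateType(), generatedAlwaysAs="CAST(ts AS DATE)")
    .property("delta.enableChangeDataFeed", "true")
    .execute())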


BhaveshPatel
Community Champion

You are missing two datasets:

# Data Lake first: convert the pandas DataFrame to a Spark DataFrame
sdf = spark.createDataFrame(df)

# Data Lakehouse: write the Spark DataFrame to Delta format as a managed table
sdf.write.format("delta").mode("overwrite").saveAsTable("DimTable")

 

 

Thanks & Regards,
Bhavesh

