
AstridM
Advocate I

Schema creation in a lakehouse deployment

Hello,

I am trying to understand how to deploy lakehouse objects from one workspace to another, and among the things I need to deploy are the schemas I created.

How can I create schemas in a lakehouse using a pipeline or a notebook?

Also, is there a way to create tables?

I thought there was some kind of API we could use to deploy from one place to another.

On top of that, I am trying to integrate everything with DevOps.

thanks

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @AstridM ,

 

A new schema can be created by running a SQL statement directly against the SQL analytics endpoint of the lakehouse.

CREATE SCHEMA test_schema

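Since the question also mentions pipelines, notebooks, and DevOps: the same DDL can be generated as an idempotent script and replayed against the target workspace's SQL endpoint during deployment. A minimal sketch; the helper and the schema names are illustrative assumptions, not a Fabric API:

```python
# Sketch: build idempotent CREATE SCHEMA statements that can be replayed
# safely against a target lakehouse SQL endpoint during deployment.
# The schema names below are illustrative assumptions.
def create_schema_statements(schemas):
    statements = []
    for name in schemas:
        bracketed = name.replace("]", "]]")   # escape ] inside [...] quoting
        quoted = name.replace("'", "''")      # escape ' inside the N'...' literal
        statements.append(
            f"IF SCHEMA_ID(N'{quoted}') IS NULL "
            f"EXEC('CREATE SCHEMA [{bracketed}]');"
        )
    return statements

for stmt in create_schema_statements(["test_schema", "staging"]):
    print(stmt)
```

In T-SQL, CREATE SCHEMA must be the only statement in its batch, which is why it is wrapped in EXEC; the IF SCHEMA_ID(...) IS NULL guard makes the script safe to rerun.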

 

From a notebook, you can currently only create tables in the dbo schema.


# create the table in the default dbo schema
spark.sql("""
CREATE TABLE test2 (
    id INT,
    name STRING,
    age INT
)
""")

# insert sample rows
spark.sql("""
INSERT INTO test2 VALUES
    (1, 'Alice', 30),
    (2, 'Bob', 25),
    (3, 'Charlie', 35)
""")

 

The final result is shown in the figure below:


 

If you want to create new tables in a new schema, I have an alternative:

 

Create the table in the dbo schema and move it to the desired schema after creation.

 

For example, I have a table named test in dbo. Using the following statement, you can move it to a schema named test_schema.

ALTER SCHEMA test_schema TRANSFER dbo.test

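If several dbo tables need to move as part of a deployment, the transfer statements can be generated in the same scripted fashion and run against the SQL endpoint. A small sketch; the schema and table names are illustrative assumptions:

```python
# Sketch: generate ALTER SCHEMA ... TRANSFER statements to move a batch
# of tables out of dbo into a target schema on the SQL endpoint.
# Schema and table names here are illustrative assumptions.
def transfer_statements(target_schema, tables):
    return [
        f"ALTER SCHEMA [{target_schema}] TRANSFER [dbo].[{table}];"
        for table in tables
    ]

for stmt in transfer_statements("test_schema", ["test", "test2"]):
    print(stmt)
```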

 

If you have any other questions please feel free to contact me.

 

Best Regards,
Yang
Community Support Team

 

If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!


2 REPLIES
AstridM
Advocate I

Good morning, thanks. I followed your logic and was able to do almost everything. I was not able to transfer the table in Spark, but I was able to create the schema and a table.

 

 

 

