Hello,
I am trying to understand how to deploy lakehouse objects from one workspace to another, and one of the things I need to deploy is the schemas I have created.
How can I create schemas in a lakehouse using a pipeline or a notebook?
Also, is there a way to create tables?
I thought there was some kind of API we could use to deploy from one place to another.
On top of that, I am trying to integrate everything with DevOps.
Thanks
Hi @AstridM ,
A new schema can be created by running a T-SQL statement directly against the SQL analytics endpoint of the lakehouse:
CREATE SCHEMA test_schema
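If you want to run that statement from a notebook rather than the SQL endpoint's query editor, one option is to connect to the endpoint over ODBC. This is only a minimal sketch, assuming pyodbc and ODBC Driver 18 for SQL Server are available in the environment; the server and database names are placeholders you would copy from the lakehouse's SQL endpoint connection settings.
import pyodbc

# Connection string for the lakehouse SQL analytics endpoint.
# <your-endpoint> and <your-lakehouse> are placeholders; copy the real
# values from the SQL endpoint settings in Fabric.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your-lakehouse>;"
    "Authentication=ActiveDirectoryInteractive;",
    autocommit=True,  # run DDL such as CREATE SCHEMA outside an open transaction
)
conn.execute("CREATE SCHEMA test_schema")
conn.close()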
From Spark, tables can only be created in the dbo schema. For example, in a notebook:
# Create a managed table in the dbo schema (Fabric Spark stores it as a Delta table by default)
spark.sql("""
    CREATE TABLE test2 (
        id INT,
        name STRING,
        age INT
    )
""")

# Insert a few sample rows
spark.sql("""
    INSERT INTO test2 VALUES
        (1, 'Alice', 30),
        (2, 'Bob', 25),
        (3, 'Charlie', 35)
""")
If you want new tables in a new schema, there is an alternative: create the table in the dbo schema and move it to the desired schema after creation.
For example, given a table named test in dbo, the following T-SQL (run against the SQL analytics endpoint, not in Spark) moves it to a schema named test_schema:
ALTER SCHEMA test_schema TRANSFER dbo.test
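Since ALTER SCHEMA ... TRANSFER is T-SQL, it has to run against the SQL analytics endpoint rather than in Spark. A sketch of scripting it from a notebook, with the same placeholder connection details as the CREATE SCHEMA example above:
import pyodbc

# Same placeholder connection string as in the CREATE SCHEMA sketch above
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your-lakehouse>;"
    "Authentication=ActiveDirectoryInteractive;",
    autocommit=True,
)
# Move dbo.test into test_schema; both must already exist
conn.execute("ALTER SCHEMA test_schema TRANSFER dbo.test")
conn.close()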
If you have any other questions, please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, then please consider Accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
Good morning, thanks. I followed your logic and was able to do almost everything: I could not transfer the table in Spark, but I was able to create the schema and a table.