Anonymous

Creating schemas in lakehouse through pipeline + creating tables inside schemas

Hi,

Our application serves multiple tenants, and I'm looking for a way to isolate each tenant's data in our Lakehouse.

Previously, in Azure Synapse Analytics, I was able to create a lake database per tenant, and I'm looking for a similar solution in Fabric.

My question is twofold:

  1. Is it possible to create a schema through pipelines/notebooks in Fabric?
    I have created a stored procedure on the SQL endpoint (in the dbo schema) which creates a new schema:

    [screenshot: AmirHeidari_0-1716824091821.png]

    I can't seem to find a way to execute this through a pipeline: the Script and Stored procedure activities seem to only execute against an external Azure SQL database or a Fabric warehouse, but not against the Lakehouse. Has anyone found a way to execute such scripts against the Lakehouse?
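For reference, a minimal sketch of such a schema-creating procedure. The original code is only visible in the screenshot, so the procedure name, the @SchemaName parameter, and the QUOTENAME guard are my assumptions:

```sql
-- Hypothetical reconstruction: a procedure in dbo that creates a schema
-- for a given tenant. CREATE SCHEMA cannot take a variable directly,
-- so dynamic SQL is used; QUOTENAME guards against injection.
CREATE PROCEDURE dbo.CreateTenantSchema
    @SchemaName SYSNAME
AS
BEGIN
    IF NOT EXISTS (SELECT 1 FROM sys.schemas WHERE name = @SchemaName)
    BEGIN
        DECLARE @sql NVARCHAR(MAX) =
            N'CREATE SCHEMA ' + QUOTENAME(@SchemaName) + N';';
        EXEC sp_executesql @sql;
    END
END
```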

  2. Even if I execute the above script manually to create the schema, I cannot use notebooks to create delta tables inside that schema.
    For example, I have created a schema called fd7311e6:

    [screenshot: AmirHeidari_1-1716824346682.png]

    If I want to create a delta table within that schema, I get the following error:

    [screenshot: AmirHeidari_2-1716824376010.png]
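For context, the failing statement was presumably of this general shape (the table name and columns are hypothetical; the error screenshot is not readable here):

```sql
-- Spark SQL run from a Fabric notebook; this fails because the target
-- schema fd7311e6 is not dbo, which notebooks could not write to at the time
CREATE TABLE fd7311e6.analytics_orders (
    order_id BIGINT,
    amount   DOUBLE
) USING DELTA;
```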

    So far I have created the tables in dbo and distinguished them by a naming convention. But what I ultimately need is to move my analytics tables into their own schema and then serve them to our BI.

    [screenshot: AmirHeidari_3-1716824644177.png]

    Thanks

1 ACCEPTED SOLUTION
Anonymous

Hi @Anonymous 
Thanks for using Fabric Community.
1) You can connect to the Lakehouse using the SQL endpoint connection string by following these steps:

[screenshot: vnikhilanmsft_0-1716872245579.png]

Copy the SQL connection string:

[screenshot: vnikhilanmsft_1-1716872288247.png]


Create a new pipeline and add the Stored procedure activity. In the Settings pane, select More:

[screenshot: vnikhilanmsft_2-1716872363391.png]

In the OneLake data hub you will find the same SQL connection string:

[screenshot: vnikhilanmsft_3-1716872526909.png]

Select it and provide the credentials. You can then pick the desired Lakehouse and stored procedure from the dropdowns:

[screenshot: vnikhilanmsft_4-1716872706256.png]

[screenshot: vnikhilanmsft_5-1716872728412.png]

In this way you can execute a stored procedure defined on the Lakehouse SQL endpoint.

2) Using notebooks, you can create tables only in the dbo schema; you cannot create tables inside another schema from a notebook.
For now you can use this workaround:

Create the tables in the dbo schema and move them into the desired schema after creation.
For example, with a table named customers_1000 in dbo, the code below moves it into a schema named MAIN:

[screenshot: vnikhilanmsft_6-1716875995093.png]
[screenshot: vnikhilanmsft_7-1716876016025.png]
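The code in the screenshots is not readable here, but the standard T-SQL way to move a table between schemas on a SQL endpoint is ALTER SCHEMA ... TRANSFER. A sketch, reusing the MAIN schema and customers_1000 table named above:

```sql
-- Create the target schema if it doesn't exist yet
CREATE SCHEMA MAIN;
GO
-- Move the table out of dbo into the MAIN schema;
-- the table keeps its name but is now MAIN.customers_1000
ALTER SCHEMA MAIN TRANSFER dbo.customers_1000;
```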

 

By following these steps you can move tables into the desired schema.
Hope this helps. Please let me know if you have any further queries.


3 REPLIES
Anonymous

Thanks!
I still need to figure out some access issues related to service principal authentication, but the schema transfer works in my case.
I remember a similar case in Synapse, where I needed to run scripts in serverless SQL using pipelines; I ended up creating serverless SQL as a linked service, and it worked. So this solution is quite similar to that one.

Anonymous

Hi @Anonymous 
Glad that your query got resolved. Please continue using Fabric Community for any help regarding your queries.

