AmirHeidari
Frequent Visitor

Creating schemas in lakehouse through pipeline + creating tables inside schemas

Hi,

Our application serves different tenants, and I was looking for a way to isolate each tenant's data in our lakehouse.

Previously, in Azure Synapse Analytics, I was able to create a lake database per tenant, and I'm looking for a similar solution in Fabric.

My question is twofold:

  1. Is it possible to create a schema through pipelines/notebooks in Fabric?
    I have created a stored procedure on the SQL endpoint, in the dbo schema, which creates a new schema:

    [screenshot: stored procedure definition]

    I can't seem to find a way to execute this through a pipeline: the Script and Stored procedure activities only seem to run against an external Azure SQL database or a Fabric warehouse, not the lakehouse. Has anyone found a way to execute such scripts against the lakehouse?
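For context, a schema-creating stored procedure of the kind described in the screenshot (not reproduced here; the procedure and parameter names below are hypothetical) might look like:

```sql
-- Hypothetical sketch: create a schema for a tenant if it does not exist yet.
CREATE PROCEDURE dbo.usp_CreateTenantSchema
    @SchemaName SYSNAME
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);
    IF NOT EXISTS (SELECT 1 FROM sys.schemas WHERE name = @SchemaName)
    BEGIN
        -- CREATE SCHEMA must be the only statement in its batch,
        -- so it is run via dynamic SQL.
        SET @sql = N'CREATE SCHEMA ' + QUOTENAME(@SchemaName) + N';';
        EXEC sp_executesql @sql;
    END
END
```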

  2. Even if I execute the above script manually to create the schema, I cannot use notebooks to create delta tables inside it.
    For example, I have created a schema called fd7311e6:

    [screenshot: schema fd7311e6]

    If I want to create a delta table within that schema, I get the following error:

    [screenshot: error message]

    So far I have tried to create tables and distinguish them by a naming convention, but what I ultimately need is to move my analytics tables into their own schema and then serve them to our BI.

    [screenshot: tables distinguished by naming convention]

    Thanks 
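As a stopgap for the naming-convention approach mentioned above, a small helper that encodes the tenant in the table name might look like this (a sketch; the tenant IDs, table names, and separator are hypothetical):

```python
import re


def tenant_table_name(tenant_id: str, table: str) -> str:
    """Build a dbo table name that encodes the tenant, e.g. 'fd7311e6__orders'.

    Until per-schema tables are supported from notebooks, all tables live in
    dbo and the tenant is encoded in the name itself.
    """
    # Keep only characters that are safe in a Delta table name.
    safe_tenant = re.sub(r"[^0-9A-Za-z_]", "_", tenant_id)
    safe_table = re.sub(r"[^0-9A-Za-z_]", "_", table)
    return f"{safe_tenant}__{safe_table}"


def split_tenant_table_name(name: str) -> tuple[str, str]:
    """Recover (tenant_id, table) from a name built by tenant_table_name."""
    tenant, _, table = name.partition("__")
    return tenant, table
```

A notebook would then write `spark.table(...)`-style reads and saves against these names, with the double underscore acting as the schema separator.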

1 ACCEPTED SOLUTION
v-nikhilan-msft
Community Support

Hi @AmirHeidari 
Thanks for using Fabric Community.
1) You can connect to the Lakehouse using the SQL endpoint connection string by following the steps below:

[screenshot: Lakehouse SQL endpoint]

Copy the SQL connection string:

[screenshot: copying the connection string]


Create a new pipeline and add a Stored procedure activity. In the Settings pane, select "More":

[screenshot: Stored procedure activity settings]

In the OneLake data hub you will find the same SQL connection string:

[screenshot: OneLake data hub]


Select it and provide the credentials. You can then pick the desired lakehouse and the stored procedure from the dropdowns:

[screenshot: selecting the lakehouse]

[screenshot: selecting the stored procedure]


In this way you can execute a stored procedure defined on the Lakehouse SQL endpoint.

2) Using notebooks, you can create tables only in the dbo schema; you cannot create tables inside another schema from a notebook.
For now you can use this workaround: create the tables in the dbo schema and move them into the desired schema after creation.

For example, I have a table named customers_1000 in dbo. Using the code below you can move it into a schema named MAIN:

[screenshot: code to move the table]

[screenshot: table after the move]

By following these steps you can move tables into the desired schema.
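The exact code in the screenshots is not reproduced here. In T-SQL, one common pattern for moving a table between schemas is ALTER SCHEMA ... TRANSFER, sketched below with the names from the example (whether this statement is permitted depends on the endpoint's write support):

```sql
-- Move dbo.customers_1000 into the MAIN schema.
ALTER SCHEMA MAIN TRANSFER dbo.customers_1000;
```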
Hope this helps. Please let me know if you have any further queries.


3 REPLIES
AmirHeidari
Frequent Visitor

Thanks!
I still need to figure out some access issues related to service principal authentication, but otherwise the schema transfer works in my case.
I remember a similar case in Synapse, where I needed to run scripts in serverless SQL from pipelines: I ended up registering serverless SQL as a linked service, and that worked. This solution is quite similar to that one.

Hi @AmirHeidari 
Glad that your query got resolved. Please continue using Fabric Community for any help regarding your queries.

