Hi,
Our application serves multiple tenants, and I'm looking for a way to isolate each tenant's data in our Lakehouse.
Previously, in Azure Synapse Analytics, I was able to create a lake database per tenant, and I'm looking for a similar solution in Fabric.
My question is twofold:
1. I can't seem to find a way to execute this through a pipeline: the Script and Stored procedure activities appear to run only against an external Azure SQL database or a Fabric warehouse, not the Lakehouse. Has anyone found a way to execute such scripts against the Lakehouse?
2. If I try to create a delta table within that schema, I get an error.
So far I have tried creating the tables and distinguishing them by naming convention. But what I ultimately need is to move my analytics tables into their own schema and then serve them to our BI.
Thanks
Hi @AmirHeidari
Thanks for using Fabric Community.
1) You can connect to the Lakehouse through its SQL analytics endpoint connection string by following these steps:
1. Copy the SQL connection string from the Lakehouse.
2. Create a new pipeline and add a Stored procedure activity. In the Settings pane, select More.
3. In the OneLake data hub you will find the same SQL connection string.
4. Select it and provide your credentials. You can then pick the desired Lakehouse and the stored procedure from the dropdowns.
In this way you can execute a stored procedure that lives in the Lakehouse.
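As a minimal sketch of the kind of procedure such an activity could invoke: the procedure name, the view it manages, and the is_active column below are all hypothetical, and since the SQL analytics endpoint is read-only for table data, a procedure there would typically manage metadata objects such as views:

```sql
-- Hypothetical example; object names are illustrative.
-- Created against the Lakehouse SQL analytics endpoint, then invoked from
-- the pipeline's Stored procedure activity.
CREATE PROCEDURE dbo.usp_refresh_tenant_views
AS
BEGIN
    -- The endpoint is read-only for table data, so this procedure only
    -- (re)defines a view over an existing delta table.
    EXEC('CREATE OR ALTER VIEW dbo.vw_active_customers AS
          SELECT * FROM dbo.customers_1000 WHERE is_active = 1');
END;
```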
2) Using notebooks, you can create tables only in the dbo schema; you cannot create tables inside another schema from a notebook.
For now, you can use the following workaround:
Create the tables in the dbo schema (for example, from a notebook, as in the sketch below) and move them into the desired schema after creation.
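A hypothetical notebook cell (Spark SQL) for that first step, where staging_customers stands in for whatever source you are loading from:

```sql
-- Hypothetical notebook cell (Spark SQL). The table lands in the default
-- dbo schema; staging_customers is an illustrative source name.
CREATE TABLE customers_1000
USING DELTA
AS SELECT * FROM staging_customers;
```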
Say you have a table named customers_1000 in dbo. With code along the lines of the sketch below, you can move it into a schema named MAIN.
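A minimal T-SQL sketch of such a transfer, assuming the endpoint you connect to permits this DDL in your environment:

```sql
-- Sketch: move dbo.customers_1000 into a schema named MAIN.
-- ALTER SCHEMA ... TRANSFER is standard T-SQL; whether the Lakehouse SQL
-- analytics endpoint allows it may depend on your environment.
CREATE SCHEMA MAIN;
GO
ALTER SCHEMA MAIN TRANSFER dbo.customers_1000;
```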
By following these steps you can move your tables into the desired schema.
Hope this helps. Please let me know if you have any further queries.
Thanks
I need to figure out some access issues related to service principal authentication, but in any case the schema transfer works in my case.
Actually, I remember having a similar case in Synapse, where I needed to run scripts against serverless SQL from pipelines. I ended up creating serverless SQL as a linked service, and it worked. So this solution is quite similar to that one.
Hi @AmirHeidari
Glad that your query got resolved. Please continue using Fabric Community for any help regarding your queries.