Hi,
Our application serves multiple tenants, and I was looking for a way to isolate each tenant's data in our Lakehouse.
Previously, in Azure Synapse Analytics, I was able to create a lake database per tenant, and I'm looking for a similar solution in Fabric.
My question is twofold:
1) I can't seem to find a way to execute this through a pipeline, as the Script and Stored procedure activities appear to run only against an external Azure SQL database or a Fabric Warehouse, but not against the Lakehouse. Has anyone found a way to execute such scripts against the Lakehouse?
2) If I want to create a delta table within that schema, I get an error.
Of course, so far I have tried to create tables and distinguish them by a naming convention. But what I ultimately need is to move my analytics tables into their own schema and then serve them to our BI layer.
Thanks
Hi @Anonymous
Thanks for using Fabric Community.
1) You can connect to the Lakehouse using the SQL endpoint connection string by following these steps:
- Copy the SQL connection string from the Lakehouse.
- Create a new pipeline and add a Stored procedure activity. In its Settings pane, select "More".
- In the OneLake hub you will find the exact same SQL connection string.
- Select it and provide the credentials. You can then find the desired Lakehouse and the stored procedure in the dropdowns.
In this way you can execute a stored procedure present in the Lakehouse.
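For reference, here is a hypothetical example of such a procedure, created through the SQL analytics endpoint; the procedure name and parameter are illustrative, not from this thread. It provisions a per-tenant schema, matching the multi-tenant setup described above:

```sql
-- Hypothetical helper: provisions a schema for one tenant.
-- CREATE SCHEMA must be the only statement in its batch, so it is wrapped
-- in dynamic SQL built with QUOTENAME to keep the name safely quoted.
CREATE PROCEDURE dbo.usp_CreateTenantSchema
    @SchemaName SYSNAME
AS
BEGIN
    IF NOT EXISTS (SELECT 1 FROM sys.schemas WHERE name = @SchemaName)
    BEGIN
        DECLARE @sql NVARCHAR(300) = N'CREATE SCHEMA ' + QUOTENAME(@SchemaName) + N';';
        EXEC (@sql);
    END
END;
```

The Stored procedure activity would then invoke it once per tenant, e.g. EXEC dbo.usp_CreateTenantSchema @SchemaName = 'tenant_a';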
2) Using notebooks, you can create tables only in the dbo schema; creating new tables inside another schema from a notebook is not supported.
For now, you can use this workaround:
Create the tables in the dbo schema and move them into the desired schema after creation.
For example, I have a table named customers_1000 in dbo. Using the code below, you can move it into a schema named MAIN.
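A minimal T-SQL sketch of that move, run against the Lakehouse SQL analytics endpoint (GO is the client-side batch separator):

```sql
-- Create the target schema; CREATE SCHEMA must be the only statement in its batch.
CREATE SCHEMA MAIN;
GO

-- Transfer the table out of dbo into MAIN. Only the schema qualification
-- changes; the underlying delta table and its data are untouched.
ALTER SCHEMA MAIN TRANSFER dbo.customers_1000;
GO
```

After the transfer, the table is referenced as MAIN.customers_1000.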
By following these steps you can move your tables into the desired schema.
Hope this helps. Please let me know if you have any further queries.
Thanks
I need to figure out some access-related issues with service principal authentication, but the schema transfer works in my case.
Actually, I remember I had a similar case in Synapse as well, where I needed to run some scripts in serverless SQL using pipelines. I ended up creating serverless SQL as a linked service, and it worked. So this solution is pretty similar to that one.
Hi @Anonymous
Glad that your query got resolved. Please continue using Fabric Community for any help regarding your queries.