Dinesh_K
New Member

How to Parameterize Lakehouse Name in Warehouse Stored Procedure in Fabric?

I have a stored procedure in a Fabric Warehouse that queries a Lakehouse table. It works fine, but I need to parameterize the Lakehouse name so the procedure can run across different environments where Lakehouse names vary.

Options I’ve tried:

  1. Dynamic SQL – Works for querying, but doesn’t allow pushing results to a temp table and becomes complex for large queries.
  2. Warehouse Views – Works, but requires creating a separate view for each Lakehouse table.

Is there any other approach to make the Lakehouse name dynamic when moving between environments?

2 ACCEPTED SOLUTIONS
apturlov
Responsive Resident

Hello @Dinesh_K, thank you for your question; it is a very relevant, real-world one. The options you mentioned are the only ones I could find in the research I did.

 

In my scenario, I use a Lakehouse as a silver-tier data store from which I regularly load data into a warehouse's dimensional schema. So I defined a stored procedure containing SQL code that loads data from Lakehouse tables into Warehouse tables. In that SQL I have to use a three-part identifier for the Lakehouse objects: [lakehouse_name].[schema_name].[table_name]. As you correctly described, if I move my warehouse into another workspace, say a different environment, I will likely need to point the stored procedures to a different Lakehouse. Currently, there is no way to parameterize a Lakehouse object name in the SQL code, and T-SQL synonyms are not supported in Fabric Warehouse. My choice at this point would be to define a view in the warehouse for each Lakehouse table (or combination of tables if the view is complex) and use the views in my Warehouse stored procedures' SQL code instead of pointing directly to the Lakehouse tables.
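As a minimal sketch of the views approach (all object names below are illustrative, not from the original post): only the view definition carries the environment-specific Lakehouse name, so the stored procedure itself never changes between environments.

```sql
-- Warehouse view wrapping the Lakehouse table; only this definition
-- changes between environments (names are illustrative).
CREATE VIEW dbo.vw_SilverCustomer
AS
SELECT CustomerId, CustomerName, ModifiedDate
FROM [Silver_Lakehouse].[dbo].[customer];
GO

-- The stored procedure references only the view, so its code is
-- environment-independent.
CREATE PROCEDURE dbo.usp_LoadDimCustomer
AS
BEGIN
    INSERT INTO dbo.DimCustomer (CustomerId, CustomerName, ModifiedDate)
    SELECT CustomerId, CustomerName, ModifiedDate
    FROM dbo.vw_SilverCustomer;
END;
```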

 

Next, I would address how to automate a deployment process for the warehouse. Most likely I would use a Database project for the Warehouse, kept in my Git repository and deployed through a DevOps pipeline in either Azure DevOps or GitHub. Before the deployment I can run a script that goes through all the views' SQL definitions and replaces the Lakehouse name with the target one, based on the known destination environment. Such a script can even be generated dynamically from the current set of views.
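To illustrate how such a rewrite script could be generated from the current set of views, here is a minimal sketch that builds ALTER VIEW statements from the warehouse's system catalog views; the Lakehouse names (Silver_Lakehouse_Dev, Silver_Lakehouse_Prod) are placeholders, not names from the original post.

```sql
-- Sketch: generate ALTER VIEW statements that swap the Lakehouse name
-- in every view definition that references it. The source and target
-- Lakehouse names below are illustrative placeholders.
SELECT REPLACE(
           REPLACE(m.definition, 'CREATE VIEW', 'ALTER VIEW'),
           '[Silver_Lakehouse_Dev]',
           '[Silver_Lakehouse_Prod]'
       ) AS AlterStatement
FROM sys.sql_modules AS m
JOIN sys.views AS v
    ON v.object_id = m.object_id
WHERE m.definition LIKE '%Silver_Lakehouse_Dev%';
```

The generated statements could then be executed by the deployment pipeline (or saved into the Database project) before the stored procedures run against the target environment.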

 

That's how I would approach this scenario. I would not use dynamic SQL, as it is more complicated to develop and maintain, and more error-prone, than views.

 

Hope this answer helps you choose the right path in your scenario.

 

If you find this answer helpful please give it kudos or mark as a solution for your question.


apturlov
Responsive Resident

@Dinesh_K here is another idea for how to parameterize a Lakehouse table name. It's not directly tied to the warehouse itself, and whether it fits depends on your scenario. As I described above, I used a stored procedure in a warehouse to load data from a Lakehouse, and to automate that process I also had a data pipeline with a Stored procedure activity. But I have a choice here: I can instead use a Script activity and put the SQL that loads the data directly in the activity rather than calling a stored procedure. In that case I can easily parameterize the Lakehouse name in my script using a pipeline dynamic expression, defining the Lakehouse name as a pipeline parameter that I can set when executing the pipeline. I could also use a Variables library and define a Lakehouse name variable for multiple environments that will be replaced automatically if I use Fabric deployment pipelines.
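A minimal sketch of such a Script activity body, assuming a pipeline parameter named LakehouseName (the table and column names are illustrative): the @{...} dynamic expression is resolved by the pipeline before the SQL is sent to the warehouse.

```sql
-- SQL placed directly in a pipeline Script activity.
-- @{pipeline().parameters.LakehouseName} is a pipeline dynamic
-- expression; the parameter name and all object names are
-- illustrative assumptions.
INSERT INTO dbo.DimCustomer (CustomerId, CustomerName, ModifiedDate)
SELECT CustomerId, CustomerName, ModifiedDate
FROM [@{pipeline().parameters.LakehouseName}].[dbo].[customer];
```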

This is another idea, but which path to take strongly depends on your scenario.

 

If you find this answer helpful please give it kudos and/or mark as a solution for your question.


5 REPLIES
v-sshirivolu
Community Support

Hi @Dinesh_K ,

I would also like to take a moment to thank @apturlov for actively participating in the community forum and for the solutions you've been sharing. Your contributions make a real difference.
 

I wanted to check whether you have had a chance to review the information provided. Please feel free to reach out if you have any further questions.

 

Hi @Dinesh_K ,
Hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We're always here to support you.

 


 


