Newbie here!
I would like to set up a process but couldn't find a way or an answer for it.
The purpose is to load a specific list of tables from one data source.
This list is subject to change (values may be added, modified, or removed).
For example, a list of tables:
What is the best approach?
Regards,
Théo
Hello @Theo_MS
To dynamically load a list of tables whose names change over time in a Microsoft Fabric pipeline, use a metadata-driven approach with Lookup and ForEach activities.
1. Retrieve Table Names with a Lookup Activity
Use a query or stored procedure to fetch the latest table list from your source system (a sketch is shown below).
Store your table names in a configuration table for easier updates.
Ensure the Lookup returns results as an array (e.g., `@activity('Lookup1').output.value`).
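For illustration, a minimal Lookup query could look like the following; the configuration table `dbo.table_config` and its columns are assumed names, not objects from your environment:

```sql
-- Illustrative configuration table queried by the Lookup activity
-- to get the current list of tables to load.
SELECT table_name
FROM dbo.table_config
WHERE is_active = 1
ORDER BY table_name;
```

Leave "First row only" unchecked on the Lookup so every row is returned in `output.value`.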
2. Iterate with ForEach Activity
Configure the ForEach activity to process each table (a JSON sketch follows the structure list below):
"items": "@activity('Lookup1').output.value"
Pipeline structure:
1. Lookup Activity (Get Table List)
2. ForEach Activity (Iterate Tables)
   • If Activity (Filter Table)
   • Copy Data Activity (Valid Tables)
   • Optional: Logging Activity (Skipped Tables)
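Putting the ForEach and Copy Data pieces together, here is a rough JSON sketch of the wiring. The property names follow the standard Data Factory pipeline schema, but the `table_name` column, the `SqlServerSource` source type, and the `LakehouseTableSink` sink type are assumptions that depend on your actual connectors:

```json
{
  "name": "ForEachTable",
  "type": "ForEach",
  "dependsOn": [
    { "activity": "Lookup1", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "items": {
      "value": "@activity('Lookup1').output.value",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CopyTable",
        "type": "Copy",
        "typeProperties": {
          "source": {
            "type": "SqlServerSource",
            "sqlReaderQuery": {
              "value": "SELECT * FROM @{item().table_name}",
              "type": "Expression"
            }
          },
          "sink": {
            "type": "LakehouseTableSink"
          }
        }
      }
    ]
  }
}
```

Each iteration exposes the current Lookup row as `item()`, so `item().table_name` picks up whatever column your Lookup query returns.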
https://github.com/Azure/AnalyticsinaBox/tree/main/Fabric/src/deployment/Pattern1and2
If this is helpful, please accept the answer.
There is another way. You can create a parameter of type array and enter your list into the default value. Then you use the parameter in the ForEach activity and the Copy Data activity. In the Copy Data activity you use a query: you can keep it simple with `@item()`, or you can write a fuller query; I generally write a query that adds columns to every table in the list. The destination also has to be configured with `@item()` (see the sketch after the steps below).
Create Parameter:
Configure foreach activity:
Configure copy data activity:
Copy data destination configuration:
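As a rough sketch of that setup (the parameter name `TableList` and the sample table names are made up for illustration), the pipeline parameter could be defined like this:

```json
{
  "parameters": {
    "TableList": {
      "type": "array",
      "defaultValue": [ "dbo.Customers", "dbo.Orders", "dbo.Products" ]
    }
  }
}
```

Then point the ForEach items at `@pipeline().parameters.TableList`, use a source query such as `SELECT * FROM @{item()}` in the Copy Data activity, and set the destination table name to `@item()`.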
Hi @nilendraFabric,
thanks for your explanation and steps.
Currently, in a pipeline, it seems impossible to use a stored procedure created in a Lakehouse.
The "stored procedure" activity accepts only Warehouse.
How can I store table names in a configuration table in a Lakehouse?
Do you have a different proposal?
Regards,
Théo
Add a Lookup activity to your pipeline:
• Source: Lakehouse table `config_table` (a sketch for creating it is shown below).
• Configure it to return all rows (not just the first row).
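Since the Stored procedure activity only targets a Warehouse, you can create and maintain `config_table` in the Lakehouse from a notebook instead. A minimal Spark SQL sketch, with illustrative column names and sample values:

```sql
-- Run in a Fabric notebook attached to the Lakehouse (Spark SQL cell).
-- Creates a small configuration table that the pipeline's Lookup activity reads.
CREATE TABLE IF NOT EXISTS config_table (
    table_name STRING,
    is_active  BOOLEAN
);

-- Illustrative rows; maintain this list as tables are added, renamed, or removed.
INSERT INTO config_table VALUES
    ('sales_orders', true),
    ('customers',    true),
    ('products',     true);
```

The Lookup activity then reads this Lakehouse table directly and feeds `@activity('Lookup1').output.value` into the ForEach, exactly as described above.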