Newbie here!
I would like to set something up but couldn't find a way to do it or an answer.
The goal is to load a list of specific table names from one data source.
This list is subject to change (values can be added, modified, or deleted).
Example list of tables:
What is the best approach?
Regards,
Théo
Hello @Theo_MS
To dynamically load a changing list of tables in Microsoft Fabric pipelines, use a metadata-driven approach with Lookup and ForEach activities.
1. Retrieve Table Names with a Lookup Activity
• Use a query or stored procedure to fetch the latest table list from your source system (a sketch follows below)
• Store your table names in a configuration table for easier updates
• Ensure the Lookup returns results as an array (e.g., `@activity('Lookup1').output.value`)
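As a rough illustration, a heavily abridged Lookup activity definition could look like the sketch below. The `config_tables` table, its `TableName` column, and the `SqlSource` source type are assumptions for the example; the exact source properties depend on your connector.

```json
{
  "name": "Lookup1",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "SqlSource",
      "sqlReaderQuery": "SELECT TableName FROM dbo.config_tables WHERE IsActive = 1"
    },
    "firstRowOnly": false
  }
}
```

Setting `firstRowOnly` to `false` is what makes the whole list available as an array in `@activity('Lookup1').output.value`.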
2. Iterate with ForEach Activity
• Configure the ForEach activity to process each table:
"items": "@activity('Lookup1').output.value"
The overall pipeline structure:
1. Lookup Activity (Get Table List)
2. ForEach Activity (Iterate Tables), containing:
   • If Activity (Filter Table)
   • Copy Data Activity (Valid Tables)
   • Optional: Logging Activity (Skipped Tables)
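The If activity's condition is just a pipeline expression evaluated once per iteration. A minimal sketch, assuming the configuration rows expose a `TableName` column and you only want to skip empty entries:

```json
{
  "name": "FilterTable",
  "type": "IfCondition",
  "typeProperties": {
    "expression": {
      "value": "@not(empty(item().TableName))",
      "type": "Expression"
    }
  }
}
```

The Copy Data activity then goes under the true branch and the optional logging activity under the false branch.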
https://github.com/Azure/AnalyticsinaBox/tree/main/Fabric/src/deployment/Pattern1and2
If this is helpful, please accept the answer.
There is another way: create a pipeline parameter of type array and enter your list as its default value. Then use the parameter in the ForEach activity and in the Copy Data activity. In the Copy Data source you can use a query and keep it simple with `@item()`, or write a fuller query; I generally write a query that adds columns to every table in the list. The destination also has to be configured with `@item()` (see the sketch after the steps below).
The steps:
1. Create the parameter.
2. Configure the ForEach activity.
3. Configure the Copy Data activity.
4. Configure the Copy Data destination.
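A minimal sketch of that setup, assuming an array parameter named `TableList` holding plain table-name strings and a SQL-style source (the parameter name, table names, and the added column are all illustrative):

```json
{
  "parameters": {
    "TableList": {
      "type": "Array",
      "defaultValue": [ "customers", "orders", "invoices" ]
    }
  },
  "activities": [
    {
      "name": "ForEachTable",
      "type": "ForEach",
      "typeProperties": {
        "items": {
          "value": "@pipeline().parameters.TableList",
          "type": "Expression"
        },
        "activities": [
          {
            "name": "CopyTable",
            "type": "Copy",
            "typeProperties": {
              "source": {
                "type": "SqlSource",
                "sqlReaderQuery": {
                  "value": "@concat('SELECT *, ''erp'' AS source_system FROM ', item())",
                  "type": "Expression"
                }
              }
            }
          }
        ]
      }
    }
  ]
}
```

On the destination side, the table name field is likewise set to the dynamic content `@item()`, so each iteration writes to its own target table.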
Hi @nilendraFabric,
thanks for your explanation and steps.
Currently, it seems impossible to use a stored procedure created in a Lakehouse from a pipeline.
The Stored Procedure activity only accepts a Warehouse.
How can I store table names in a configuration table in a Lakehouse?
Do you have a different proposal?
Regards,
Théo
Add a Lookup activity to your pipeline:
• Source: Lakehouse table `config_table`.
• Configure it to return all rows (not just the first).
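In the Lookup activity's settings the key point is unchecking "First row only" (the `firstRowOnly` property in the underlying JSON). The sketch below leaves out the Lakehouse connection and table settings, since those depend on your workspace:

```json
{
  "name": "LookupConfig",
  "type": "Lookup",
  "typeProperties": {
    "firstRowOnly": false
  }
}
```

Downstream, `@activity('LookupConfig').output.value` feeds the ForEach items, and each table name is read with `@item().TableName` (assuming the Lakehouse `config_table` has a `TableName` column).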