Hello!
When setting up an If Condition nested inside a ForEach loop in a data pipeline, I am able to reference the @item() that the ForEach loop is iterating over in the If Condition expression, and the pipeline validates. However, when I attempt to use @item() in a Dataflow activity inside that loop, it no longer works and seems to lose the context it is in.
1) Attempting to run the pipeline results in the following error message: "ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression 'item.pid' "
2) Clicking the "Add dynamic content" link for the parameter only shows the inputs for the ForEach loop one level up (rough sketch of the setup below).
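To be concrete, the expressions in play look roughly like this ('pid' is just an example field name from the ForEach items, and the condition itself is made up):

  If Condition expression (validates and runs):   @equals(item().pid, 1)
  Dataflow activity parameter value (fails):      @item().pid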
Thanks for any help
This issue occurs in Fabric pipelines, which follow the Azure Data Factory model, when @item() is used in a Dataflow activity nested inside a ForEach loop. At runtime, @item() goes out of scope within the Dataflow activity, because the Dataflow runs in its own expression context and does not recognize @item() directly.
Create a parameter in the Dataflow activity (pipeline side), then explicitly pass the @Item() value into that parameter.
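For illustration, the wiring ends up roughly like this, sketched in ADF-style activity JSON (Lookup1, MyDataflow, and pid are placeholder names, and the exact property layout and quoting rules for the Fabric Dataflow activity may differ slightly):

  {
    "name": "ForEach1",
    "type": "ForEach",
    "typeProperties": {
      "items": { "value": "@activity('Lookup1').output.value", "type": "Expression" },
      "activities": [
        {
          "name": "Dataflow1",
          "type": "ExecuteDataFlow",
          "typeProperties": {
            "dataflow": {
              "referenceName": "MyDataflow",
              "type": "DataFlowReference",
              "parameters": {
                "pid": { "value": "@item().pid", "type": "Expression" }
              }
            }
          }
        }
      ]
    }
  }

Inside the dataflow you then reference only its own pid parameter; @item() is never used there, so the scoping problem disappears. Also note the parentheses: the pipeline expression is @item().pid, not @item.pid.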
HTH!
Thanks for the assistance @libpekin !
This is a bit tricky. I do have a solution built for this that works using variables, as you suggest, but I'm running it sequentially. It appears there should be a way (at least per this ADF thread: Running a ForEach activity in parallel mode - Microsoft Q&A) to set a variable local to the ForEach loop, but this does not appear to be available in data pipelines.
The documentation states that the workaround for this limitation is to nest data pipelines and use the Invoke pipeline activity, but that is still in preview.
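If I'm reading that right, the workaround would look something like this inside the ForEach of a parent pipeline (sketched with the ADF-style Execute Pipeline activity, which is what Fabric's Invoke pipeline activity corresponds to; ChildPipeline and pid are placeholder names):

  {
    "name": "InvokeChild",
    "type": "ExecutePipeline",
    "typeProperties": {
      "pipeline": { "referenceName": "ChildPipeline", "type": "PipelineReference" },
      "waitOnCompletion": true,
      "parameters": {
        "pid": { "value": "@item().pid", "type": "Expression" }
      }
    }
  }

The child pipeline would declare a pid parameter and run the Dataflow activity with @pipeline().parameters.pid, so each parallel iteration gets its own copy of the value instead of sharing a pipeline-level variable.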
Have you attempted parallel execution using this method?
Hi @anon97242 ,
Thank you for engaging with the Fabric community. @libpekin 's explanation is absolutely correct.
As mentioned, @item() goes out of scope within a Dataflow activity due to a context switch at runtime. Passing values through Dataflow parameters is the recommended and supported approach in such scenarios.
For reference, here are some official Microsoft documents that explain this behavior:
ForEach activity - Azure Data Factory & Azure Synapse | Microsoft Learn
Parameterizing mapping data flows - Azure Data Factory & Azure Synapse | Microsoft Learn
Thanks for your response @libpekin .
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.