When using Dynamic Content for the Dataflow ID in the Dataflow activity, my pipelines were failing and the error description didn't make the cause clear. The dataflows were the newer Dataflow Gen2 (CI/CD) type.
Viewing the JSON for the pipeline, I noticed that the following key/value pair was omitted when switching from a picked dataflow to a Dynamic Content expression: "dataflowType": "DataflowFabric"
Here's the full JSON extract for the dataflow activity with the key added back in. With the key added, the pipeline runs successfully; without it, it fails.
I couldn't find a way to report this, hence the post.
{
    "name": "Update Dataflow",
    "type": "RefreshDataflow",
    "dependsOn": [],
    "policy": {
        "timeout": "0.12:00:00",
        "retry": 0,
        "retryIntervalInSeconds": 30,
        "secureOutput": false,
        "secureInput": false
    },
    "typeProperties": {
        "dataflowId": {
            "value": "@item()['dataflow_id']",
            "type": "Expression"
        },
        "workspaceId": "xxxx",
        "notifyOption": "NoNotification",
        "dataflowType": "DataflowFabric",
        "parameters": {
            "Report": {
                "value": {
                    "value": "@item()['report_name']",
                    "type": "Expression"
                },
                "type": "String"
            },
            "AppendReplace": {
                "value": "Replace",
                "type": "String"
            }
        }
    }
}
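For anyone checking several pipelines for the same problem, here is a small sketch that scans a pipeline definition for Refresh Dataflow activities that are missing the key. The `properties`/`activities` wrapper is assumed from the standard pipeline definition shape; the activity extract above sits inside that list.

```python
def missing_dataflow_type(pipeline: dict) -> list:
    """Return names of RefreshDataflow activities lacking 'dataflowType'."""
    return [
        activity.get("name", "<unnamed>")
        for activity in pipeline.get("properties", {}).get("activities", [])
        if activity.get("type") == "RefreshDataflow"
        and "dataflowType" not in activity.get("typeProperties", {})
    ]

# Minimal example mirroring the activity JSON above, with the key omitted
# (as happens when the Dataflow ID is switched to Dynamic Content).
pipeline = {
    "properties": {
        "activities": [
            {
                "name": "Update Dataflow",
                "type": "RefreshDataflow",
                "typeProperties": {
                    "dataflowId": {
                        "value": "@item()['dataflow_id']",
                        "type": "Expression",
                    },
                    "workspaceId": "xxxx",
                },
            }
        ]
    }
}
print(missing_dataflow_type(pipeline))  # ['Update Dataflow']
```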
Hi @russellhq ,
Thank you for reaching out to the Microsoft Community Forum.
When you manually select a dataflow in the Refresh Dataflow activity, Fabric automatically adds the "dataflowType": "DataflowFabric" property internally. But when you switch that same field to Dynamic Content such as @item()['dataflow_id'], the property is omitted entirely from the pipeline JSON.
Without "dataflowType": "DataflowFabric", the runtime defaults to the legacy Dataflow (Gen1) type, which cannot resolve or run Gen2 dataflows, leading to errors like "The dataflow ID could not be found" or "Failed to refresh dataflow, invalid ID."
Solution: add the key manually in the JSON, or, if you are building the pipeline dynamically via API/CI-CD, include it explicitly in your definition.
Please refer to the updated code below.
"typeProperties": {
    "dataflowId": {
        "value": "@item()['dataflow_id']",
        "type": "Expression"
    },
    "workspaceId": "xxxx",
    "dataflowType": "DataflowFabric",
    "notifyOption": "NoNotification"
}
Please try the alternative workaround options below.
1. Manually edit the JSON: export the pipeline JSON, add "dataflowType": "DataflowFabric" under each Dataflow activity, then import it back.
2. Use the REST API / YAML definition (CI/CD): always include "dataflowType": "DataflowFabric" in your source definition.
3. You can dynamically pass dataflowId and still keep the static "dataflowType": "DataflowFabric". Please refer to the sample code below.
"dataflowId": {
    "value": "@pipeline().parameters.DataflowId",
    "type": "Expression"
},
"dataflowType": "DataflowFabric"
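Workaround 1 can be automated when you have many activities to fix. The sketch below patches an in-memory pipeline definition, adding the key to every Refresh Dataflow activity that lacks it; the `properties`/`activities` wrapper is assumed from the standard pipeline definition shape, and you would load/save the exported JSON file around it.

```python
def ensure_dataflow_type(pipeline: dict) -> dict:
    """Add 'dataflowType': 'DataflowFabric' to every RefreshDataflow
    activity that is missing it, leaving everything else untouched."""
    for activity in pipeline.get("properties", {}).get("activities", []):
        if activity.get("type") == "RefreshDataflow":
            activity.setdefault("typeProperties", {}).setdefault(
                "dataflowType", "DataflowFabric"
            )
    return pipeline

# Minimal example: one activity with the key stripped, as in the failing case.
pipeline = {
    "properties": {
        "activities": [
            {
                "name": "Update Dataflow",
                "type": "RefreshDataflow",
                "typeProperties": {"workspaceId": "xxxx"},
            }
        ]
    }
}
patched = ensure_dataflow_type(pipeline)
props = patched["properties"]["activities"][0]["typeProperties"]
print(props["dataflowType"])  # DataflowFabric
```

Using `setdefault` means an activity that already carries a (possibly different) dataflowType value is left as-is.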
I hope this information helps. Please do let us know if you have any further queries.
Regards,
Dinesh