Is there any way to code the pipeline activities rather than using the graphical interface at all? For example:
//pseudocode in JavaScript where fnCopyActivity and fnNotebook are built-in function calls
//representative of the Copy activity and Notebook execution available in the graphical interface

//set variable
const setVariable1 = 'prod_db'

//define fnCopyActivity parameters
const copyParameters = [
  {
    source: {
      connection: 'on_prem_sql',
      query: `select * from ${setVariable1}`
    }
  },
  {
    destination: {
      connection: 'lakehouse',
      files: 'Files/daily'
    }
  }
]

//execute copy activity
const copyActivity = fnCopyActivity(copyParameters)

//take only the output ids
const copyOutputArray = copyActivity.map(a => a.output.id)

//for each output id, run notebook NB1 with the id as its src parameter
copyOutputArray.forEach(a => {
  fnNotebook('NB1', { src: a })
})
Or, with promise chaining:
// Execute copy activity
fnCopyActivity(copyParameters)
  .then((copyActivity) => {
    // If successful, extract the output ids
    const copyOutputArray = copyActivity.map(a => a.output.id);
    // Return the output array for the next .then() in the chain
    return copyOutputArray;
  })
  .then((copyOutputArray) => {
    // If successful, execute fnNotebook for each id
    return Promise.all(
      copyOutputArray.map(id => fnNotebook('NB1', { src: id }))
    );
  })
  .then((notebookResults) => {
    // All fnNotebook executions were successful
    console.log('All notebooks executed successfully', notebookResults);
  })
  .catch((error) => {
    // Handle any errors that occur at any stage of the chain
    console.error('An error occurred:', error);
  });
Thank you in advance.
https://learn.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-powershell seems like it would work, too. But the biggest problem is that you need to write a fair amount of JSON, and then essentially use PowerShell to import the configs.
Pipeline triggers proxy as opposed to redirect; in other words, a pipeline is more expensive than a notebook. If the cost were equal, it would be worth the effort.
Hi @smpa01 ,
In addition to the ideas and comments mentioned by the two members above, I think you can also try the following:
1. You can create and manage Data Factory pipelines programmatically using SDKs for .NET, Python, and other languages (see the sketch below).
2. You can also use the Azure CLI and PowerShell, command-line tools that let you manage Data Factory resources through scripts, including creating and updating pipelines.
Read this official documentation for more information: Use custom activities in a pipeline - Azure Data Factory & Azure Synapse | Microsoft Learn
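For item 1, here is a minimal sketch using the Azure Data Factory Python SDK (azure-mgmt-datafactory), following the pattern in Microsoft's Python quickstart. The subscription, resource group, factory, and dataset names are placeholders, and note that this targets Azure Data Factory rather than Fabric pipelines:

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink
)

# Placeholder subscription id; authentication uses your ambient Azure credentials
adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# A Copy activity defined entirely in code, no designer involved;
# "SourceDataset" and "SinkDataset" are assumed to already exist in the factory
copy_activity = CopyActivity(
    name="CopyDaily",
    inputs=[DatasetReference(reference_name="SourceDataset")],
    outputs=[DatasetReference(reference_name="SinkDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

# Create (or update) the pipeline; the service derives the JSON from this definition
adf_client.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "CodedPipeline",
    PipelineResource(activities=[copy_activity]),
)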
Best Regards
Yilong Zhou
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.
Data pipelines in the backend are basically JSON.
You can code the pipelines rather than using UI drag-and-drop if you are familiar with the data pipeline JSON format.
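To make that concrete, here is a rough Python sketch (illustrative, not an exact schema) of the kind of JSON a pipeline reduces to. The activity name and source/sink type names are made-up placeholders; the reliable way to see the real shape is to build a small pipeline in the UI and inspect the JSON it generates:

import json

# Illustrative only: activity and type names are placeholders, not the real schema
pipeline_json = {
    "properties": {
        "activities": [
            {
                "name": "Copy daily files",
                "type": "Copy",
                "typeProperties": {
                    "source": {"type": "SqlServerSource"},
                    "sink": {"type": "LakehouseTableSink"},
                },
            }
        ]
    }
}

# Roughly what you would edit by hand, or generate from code
print(json.dumps(pipeline_json, indent=2))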
How can we do it in Fabric? Is this how you would achieve it? (Link below)
https://www.red-gate.com/simple-talk/blogs/edit-fabric-pipeline-json/
When you create a pipeline it generates JSON, and you can edit that JSON. But there is no way you can manually create, from scratch, a JSON that represents what the UI would otherwise have generated.
In a perfect Fabric world, I would write code that creates the pipeline and derives the JSON, not the other way round.
"But there is no way you can manually create a json from scratch that would represents the json otherise created by using the UI."
I think you can: https://learn.microsoft.com/en-us/fabric/data-factory/pipeline-rest-api
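Building on that link, here is a minimal Python sketch of creating a Fabric pipeline from hand-written JSON via the Fabric items REST API. The workspace id and bearer token are placeholders, and the near-empty pipeline definition is an assumption; in practice you would paste in JSON copied from an existing pipeline:

import base64
import json
import requests

workspace_id = "<workspace-guid>"   # placeholder
token = "<aad-bearer-token>"        # placeholder; acquire via MSAL or azure-identity

# The definition you would otherwise build in the UI (kept trivial here)
pipeline_content = {"properties": {"activities": []}}

body = {
    "displayName": "CodedPipeline",
    "type": "DataPipeline",
    "definition": {
        "parts": [
            {
                "path": "pipeline-content.json",
                "payload": base64.b64encode(
                    json.dumps(pipeline_content).encode("utf-8")
                ).decode("utf-8"),
                "payloadType": "InlineBase64",
            }
        ]
    },
}

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items",
    headers={"Authorization": f"Bearer {token}"},
    json=body,
)
resp.raise_for_status()
print("Created:", resp.status_code)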
Thanks.
Yes, you can feed in a JSON that creates a pipeline. But it is harder to manually create a JSON that has all the correct info conducive to the desired pipeline.
What I am looking for is a programmatic way to create pipelines through some Python/PowerShell libraries that already have all the pre-defined pipeline methods, so that I can invoke the methods with the correct parameters, promise-chain them (on success, fail, etc.), and schedule the code for execution.
By programming it, I would then create the pipeline and the correct JSON, and I could edit the pipeline from the code itself.
Otherwise, there is no way to manually create a JSON that exactly matches the pipeline in my head.