How can I deploy a large ADF export (over 50 MB) to another environment without hitting file size limits? And if linked templates must be hosted in storage, how can I make the deployment work with Microsoft Entra ID instead of a SAS token?
Hi @bguragain ,
If your ADF export is too large to import directly (anything over a few MB will usually hit the 4 MB ARM template size limit), the supported approach is to use linked templates. When you export your factory, you'll notice ADF generates a master template plus child templates; the idea is that you host those child templates somewhere and deploy via the master.
Most examples show using a storage account + SAS token, but if you don’t want to manage SAS tokens, you don’t have to. A good alternative is to package everything into an Azure Template Spec. Template Specs let you store ARM templates natively in Azure, version them, and control access with Entra RBAC instead of generating tokens. Then in your deployment pipeline (CLI, PowerShell, or DevOps), you just point to the Template Spec resource ID and deploy it.
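A minimal sketch of that flow with the Azure CLI (the spec name, resource groups, and file paths here are placeholders; also note that ADF's linked-template export wires the child templates together through a containerUri/SAS parameter, so you may need to rework those references to relativePath before az ts create can package them, per the template spec doc linked below):

# Publish the exported master template (and its relative-path child templates)
# as a versioned Template Spec; access is governed by Entra RBAC, no SAS needed.
az ts create \
  --name adf-factory-templates \
  --resource-group MyRG \
  --location eastus \
  --version 1.0 \
  --template-file ./ExportedTemplates/ArmTemplate_master.json

# Look up the version's resource ID and deploy it into the target resource group.
id=$(az ts show --name adf-factory-templates --resource-group MyRG --version 1.0 --query "id" --output tsv)
az deployment group create \
  --resource-group MyTargetRG \
  --template-spec "$id" \
  --parameters @./ExportedTemplates/ArmTemplateParameters_master.json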
The official documents below should help with this:
Using linked Resource Manager templates - Azure Data Factory | Microsoft Learn
Best practices for templates - Azure Resource Manager | Microsoft Learn
Create & deploy template specs - Azure Resource Manager | Microsoft Learn
Create a template spec with linked templates - Azure Resource Manager | Microsoft Learn
Regards,
Akhil.
Hi @bguragain ,
Just checking in to see if the approach of using linked templates (or Template Specs instead of storage + SAS) helped you get past the deployment size limit. Were you able to try it with your factory export?
If you ran into any blockers while setting it up, feel free to share; the community can help troubleshoot further.
Regards,
Akhil
1. Use linked (modular) templates: split pipelines/datasets into smaller templates.
2. Host the linked templates in Azure Storage (see the upload sketch after this list).
3. Deploy with Azure CLI / PowerShell while signed in with Microsoft Entra ID:

az deployment group create \
  --name MyADFDeployment \
  --resource-group MyRG \
  --template-uri https://<storage-account>.blob.core.windows.net/templates/mainTemplate.json \
  --parameters @parameters.json

Note that your Entra sign-in covers the deployment call itself, but Resource Manager fetches --template-uri on its own, so the blobs must be publicly readable or carry a SAS token. For a fully Entra/RBAC-based flow, use the Template Spec approach above.
4. Alternative: use ADF Git integration + DevOps pipelines to avoid the template export size limits entirely.
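For step 2, here's a hedged sketch of staging the exported templates in a blob container (the account and container names are placeholders; --auth-mode login makes the upload itself use your Entra sign-in rather than an account key):

# Upload the exported template files to a "templates" container with your Entra identity.
# Requires the Storage Blob Data Contributor role on the account or container.
az storage blob upload-batch \
  --account-name <storage-account> \
  --destination templates \
  --source ./ExportedTemplates \
  --auth-mode login

This keeps keys and SAS out of the upload step, but as noted above, Resource Manager still resolves --template-uri on its own, so at deployment time the container needs anonymous read or a SAS unless you switch to the Template Spec route.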