
Reply
bguragain
Microsoft Employee

ADF Template import fails for being too large

How can I deploy a large ADF export over 50 MB to another environment without hitting file size limits, and if linked templates must be hosted in storage, how can I make the deployment work with entra instead of a SAS token?

1 ACCEPTED SOLUTION
v-agajavelly
Community Support

Hi @bguragain ,

If your ADF export is too large to import directly (ARM enforces a 4 MB size limit per template, so large factories hit it quickly), the supported approach is to use linked templates. When you export your factory, you'll notice ADF generates a master template plus child templates; the idea is that you host those child templates somewhere and deploy via the master.

Most examples show using a storage account + SAS token, but if you don’t want to manage SAS tokens, you don’t have to. A good alternative is to package everything into an Azure Template Spec. Template Specs let you store ARM templates natively in Azure, version them, and control access with Entra RBAC instead of generating tokens. Then in your deployment pipeline (CLI, PowerShell, or DevOps), you just point to the Template Spec resource ID and deploy it.

  • Use linked templates to get around the size limit.
  • Instead of storage + SAS, publish them into a Template Spec and secure it with Entra roles.
  • Deploy from the Template Spec resource ID – no SAS required, and it scales for big factories.
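As a sketch of that flow, the commands below package an exported factory's master template (which references its child templates by relative path) into a Template Spec and deploy it by resource ID. The names used here (adf-templates, my-rg, target-rg, mainTemplate.json) are placeholders, not values from this thread:

```shell
# Package the exported master template (and the linked child templates it
# references via relativePath) into a versioned Template Spec.
az ts create \
  --name adf-templates \
  --version 1.0 \
  --resource-group my-rg \
  --location eastus \
  --template-file ./mainTemplate.json

# Look up the resource ID of that Template Spec version.
ts_id=$(az ts show \
  --name adf-templates \
  --resource-group my-rg \
  --version 1.0 \
  --query id --output tsv)

# Deploy to the target environment. Access is governed by Entra RBAC on the
# Template Spec resource itself; no SAS token is involved anywhere.
az deployment group create \
  --resource-group target-rg \
  --template-spec "$ts_id" \
  --parameters @parameters.json
```

One nice property of this approach is versioning: each environment can pin to a specific Template Spec version, so a redeploy is reproducible even after the factory export changes.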

The following official documents cover the relevant steps:
Using linked Resource Manager templates - Azure Data Factory | Microsoft Learn
Best practices for templates - Azure Resource Manager | Microsoft Learn
Create & deploy template specs - Azure Resource Manager | Microsoft Learn
Create a template spec with linked templates - Azure Resource Manager | Microsoft Learn

Regards,
Akhil.


3 REPLIES
v-agajavelly
Community Support

Hi @bguragain ,

Just checking in to see if using linked templates (or Template Specs instead of storage + SAS) helped you move past the deployment size limit. Were you able to try it with your factory export?

If you ran into any blockers while setting it up, feel free to share; the community can help troubleshoot further.

Regards,
Akhil

Shahid12523
Community Champion

1. Use linked (modular) templates: split pipelines/datasets into smaller templates.

2. Host linked templates in Azure Storage:

  • Grant your SPN or user Storage Blob Data Reader access via Azure RBAC.
  • Use HTTPS URLs to the blobs. One caveat: the ARM deployment service fetches a --template-uri without your caller's Entra credentials, so a fully private container still needs a SAS; if you want to avoid SAS entirely, Template Specs (as in the accepted answer) are the reliable route.

3. Deploy with Azure CLI / PowerShell, signed in via Microsoft Entra ID:

az deployment group create \
--name MyADFDeployment \
--resource-group MyRG \
--template-uri https://<storage-account>.blob.core.windows.net/templates/mainTemplate.json \
--parameters @parameters.json


4. Alternative: use ADF Git integration + DevOps pipelines to avoid template export size limits entirely.

Shahed Shaikh
