
Kesahli
Frequent Visitor

Size limit on parameter value in Data Pipeline using copy data assistant?

I've used the Copy assistant in a data pipeline to set up the ingestion of approximately 360 tables from one lakehouse to another (the source lakehouse is due for decommissioning), but the pipeline only ever loads the first 151 tables. After a bit of digging, I found that the Default value of the pipeline parameter created by the Copy assistant is getting truncated at around 1,048,000 characters. The default value is basically a source/sink pairing at the column level for each table. I can get around this by setting up multiple pipelines, each one starting from where the previous one ended; each pipeline always ends up with around the same number of characters in its parameter value (1,048,006 and 1,046,979).

Is this a known limitation? Is there a better way to achieve this load? (It will be a one-off, but I'm curious to see other options.)
[Attachment: pipeline_parameter_limit.png]
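For context, the parameter's default value is a JSON array with one entry per table, something along the lines of the sketch below (the property names here are purely illustrative, not the exact schema the Copy assistant emits):

    [
      {
        "source": { "table": "dbo.Customers" },
        "destination": { "table": "dbo.Customers" },
        "columnMappings": [
          { "source": "CustomerId", "destination": "CustomerId" },
          { "source": "CustomerName", "destination": "CustomerName" }
        ]
      },
      {
        "source": { "table": "dbo.Orders" },
        "destination": { "table": "dbo.Orders" },
        "columnMappings": [
          { "source": "OrderId", "destination": "OrderId" },
          { "source": "OrderDate", "destination": "OrderDate" }
        ]
      }
    ]

With column-level mappings for roughly 360 tables, an array like this easily runs past a million characters, and both observed cut-offs (1,048,006 and 1,046,979) sit just under 1,048,576 characters (1 MiB), which suggests a roughly 1 MB cap on the parameter's default value.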

 

1 ACCEPTED SOLUTION
bdarbo78
Advocate II

Hi, I ran into the exact same issue... and was hoping to find the solution here.

I ended up creating an array variable and using a 'Set variable' activity to paste the JSON array in as its value. There does not seem to be the same limit on variable length as there is on pipeline parameters. I then used the variable in place of the initial parameter.
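Assuming the Fabric pipeline JSON mirrors the Azure Data Factory schema, the relevant fragment of the pipeline definition ends up looking roughly like the sketch below (activity and variable names are illustrative):

    {
      "variables": {
        "tableMappings": { "type": "Array", "defaultValue": [] }
      },
      "activities": [
        {
          "name": "Set table mappings",
          "type": "SetVariable",
          "typeProperties": {
            "variableName": "tableMappings",
            "value": []
          }
        },
        {
          "name": "ForEach table",
          "type": "ForEach",
          "dependsOn": [
            { "activity": "Set table mappings", "dependencyConditions": [ "Succeeded" ] }
          ],
          "typeProperties": {
            "items": { "value": "@variables('tableMappings')", "type": "Expression" },
            "activities": []
          }
        }
      ]
    }

The empty "value" array on the Set variable activity is where the JSON array generated by the Copy assistant gets pasted, the ForEach's items expression reads @variables('tableMappings') instead of @pipeline().parameters.<parameter name>, and the copy activity inside the loop stays as the assistant generated it.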


3 REPLIES

Kesahli
Frequent Visitor

That sounds like a neat solution. I'll give it a crack and see how it goes. Thanks

Anonymous
Not applicable

Hi @Kesahli,

I have not seen specific documentation on this limitation.

Thank you for sharing your solution.

There is no better solution that I'm aware of, and your approach is sufficient for your needs.

If you want to import a large number of tables faster, you could upgrade your capacity, but that is a paid option, and since you have already met your need I don't think it's necessary.

Best Regards,
Yang
Community Support Team

If any post helps, please consider accepting it as the solution to help the other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
