Kesahli
Frequent Visitor

Size limit on parameter value in a Data Pipeline using the Copy Data Assistant?

I've used the Copy Assistant in a data pipeline to set up the ingestion of approximately 360 tables from one lakehouse to another (the source lakehouse is due for decommissioning), but the pipeline only ever loads the first 151 tables. After a bit of digging, I found that the default value of the pipeline parameter created by the Copy Assistant is being truncated at around 1,048,000 characters, suspiciously close to 2^20 (1,048,576), which suggests a 1 MiB limit. The default value is essentially a source/sink pairing at the column level for each table. I can work around this by setting up multiple pipelines, each starting from where the previous one ended; each pipeline ends up with roughly the same number of characters in its parameter value (1,048,006 and 1,046,979).

Is this a known limitation? Is there a better way to achieve this load? (It will be a one-off, but I'm curious to see other options.)
[Attachment: pipeline_parameter_limit.png]
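For context, the generated default value is a JSON array of per-table, per-column mappings along these lines (structure abbreviated, and field names are illustrative rather than the exact Copy Assistant schema), which is why it grows so quickly with hundreds of tables:

```json
[
  {
    "source": { "table": "dbo.Customers" },
    "destination": { "table": "dbo.Customers" },
    "copyActivity": {
      "translator": {
        "type": "TabularTranslator",
        "mappings": [
          { "source": { "name": "CustomerId", "type": "Int64" },
            "sink": { "name": "CustomerId", "type": "Int64" } },
          { "source": { "name": "CustomerName", "type": "String" },
            "sink": { "name": "CustomerName", "type": "String" } }
        ]
      }
    }
  }
]
```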

 

1 ACCEPTED SOLUTION
bdarbo78
Advocate II

Hi, I ran into the exact same issue...and was hoping to find the solution here.

I ended up creating an array variable, setting it with a 'Set variable' activity, and pasting the JSON array in as its value. Variables do not seem to have the same length limit as pipeline parameters. I then used the variable in place of the initial parameter.
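A minimal sketch of how this might look in the pipeline definition, assuming standard Fabric/ADF pipeline JSON (names like pl_copy_tables and TableMappings are illustrative, and the inner Copy activity of the ForEach is omitted): the Set variable activity holds the large array, and the ForEach iterates over @variables('TableMappings') instead of the original parameter.

```json
{
  "name": "pl_copy_tables",
  "properties": {
    "variables": {
      "TableMappings": { "type": "Array" }
    },
    "activities": [
      {
        "name": "Set table mappings",
        "type": "SetVariable",
        "typeProperties": {
          "variableName": "TableMappings",
          "value": [
            { "source": { "table": "dbo.Customers" }, "destination": { "table": "dbo.Customers" } },
            { "source": { "table": "dbo.Orders" }, "destination": { "table": "dbo.Orders" } }
          ]
        }
      },
      {
        "name": "ForEach table",
        "type": "ForEach",
        "dependsOn": [
          { "activity": "Set table mappings", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "items": { "value": "@variables('TableMappings')", "type": "Expression" },
          "activities": []
        }
      }
    ]
  }
}
```

The array then lives in the Set variable activity's payload rather than in a parameter default value, which appears to sidestep the truncation described above.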


3 REPLIES

Kesahli
Frequent Visitor

That sounds like a neat solution. I'll give it a crack and see how it goes. Thanks!

Anonymous
Not applicable

Hi @Kesahli ,

I have not seen specific documentation on this limitation.

Thank you for sharing your solution. I am not aware of a better alternative, and your approach is sufficient for your needs.

If you want to import a large number of tables faster, you could upgrade your capacity, but that is a paid option, and since you already have a working approach I don't think it is necessary.

Best Regards,
Yang
Community Support Team

If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
