BriefStop
Helper I
Copy Jobs Failing Because Multiple are Running

I am creating a metadata-driven pipeline with a control table that cycles through the pipeline one data object at a time. For ease of use, each data object is updated through an individual copy job, and the pipeline cycles through those copy jobs.

 

However, when I run the data objects through the for loop, the first copy job completes, but the second copy job for another data object fails with an error saying that a copy job was already running at the same time.

 

Is there a way to run different copy jobs in parallel? I wanted to use copy jobs over copy activities because the transformation is not intensive and copy jobs are more low-code friendly than a copy activity in a pipeline. Copy jobs may be the better choice architecturally, but I need to be able to split the pipeline by individual data object rather than by a grouping of data objects, which is what I would get if I ran copy jobs by domain.

1 ACCEPTED SOLUTION

When you parameterise a Copy Data activity, you can also make the schema mapping dynamic by using metadata and expressions instead of hardcoding mappings in each pipeline.
 

Store Mappings in a Metadata Table

Create a control table (or JSON file) that contains:
  • Source table name
  • Target table name
  • Column mappings in JSON format, for example:
[
  { "source": "CustomerID", "sink": "Cust_ID" },
  { "source": "CustomerName", "sink": "Cust_Name" }
]
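As a sketch of the lookup step (the table, column, and dataset names here are assumptions, not from the thread), a Lookup activity can read the mapping for the current object from the control table:

```json
{
    "name": "GetMapping",
    "type": "Lookup",
    "typeProperties": {
        "source": {
            "type": "AzureSqlSource",
            "sqlReaderQuery": "SELECT ColumnMapping FROM dbo.ControlTable WHERE SourceTable = '@{item().SourceTable}'"
        },
        "dataset": { "referenceName": "ControlTableDataset", "type": "DatasetReference" },
        "firstRowOnly": false
    }
}
```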

 

Pass Mapping to Copy Activity

In the Copy Data activity, set the translator property dynamically:
 
 
"translator": {
    "type": "TabularTranslator",
    "mappings": "@activity('GetMapping').output.value"
}

 

Here, GetMapping is the Lookup activity that retrieved the JSON mapping.
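One caveat: if the mapping is stored as a JSON string in a single control-table column, the Lookup returns it as text, so it typically needs to be parsed with the json() expression function before the Copy activity can use it. A hedged sketch, assuming the column is named ColumnMapping and the Lookup uses firstRowOnly:

```json
"translator": {
    "type": "TabularTranslator",
    "mappings": "@json(activity('GetMapping').output.firstRow.ColumnMapping)"
}
```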

 

If this response was helpful in any way, I’d gladly accept a kudo.
Please mark it as the correct solution. It helps other community members find their way faster.
Connect with me on LinkedIn


6 REPLIES

Zanqueta
Super User

Hi @BriefStop 

 

Options to Solve This

1. Switch to Copy Activities in Pipelines

If your goal is parallelism, use Copy Activities inside a pipeline instead of Copy Jobs.
  • You can run multiple Copy Activities in parallel using ForEach with isSequential = false.
  • This gives you full control over concurrency and error handling.
Example:
 
"ForEach": {
    "items": "@activity('GetMetadata').output.value",
    "isSequential": false,
    "activities": [
        {
            "name": "CopyData",
            "type": "Copy",
            ...
        }
    ]
}
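If unbounded parallelism is too aggressive for the source or destination, ForEach also supports a batchCount property that caps how many iterations run at once (the value 5 here is just an illustration):

```json
"ForEach": {
    "items": "@activity('GetMetadata').output.value",
    "isSequential": false,
    "batchCount": 5,
    ...
}
```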

 

If this response was helpful in any way, I’d gladly accept a 👍, much like the joy of seeing a DAX measure work first time without needing another FILTER.

Please mark it as the correct solution. It helps other community members find their way faster (and saves them from another endless loop 🌀).

Connect with me on LinkedIn

Thank you! For parameterizing a Copy Data activity, how are we supposed to dynamically set up the mappings? I currently have to build a pipeline for every single copy I want to run, with the mappings defined inside each one, and it's definitely not best practice.


Hi @BriefStop,

 

Thank you for reaching out to Microsoft Fabric Community.

 

Thank you @Zanqueta and @nielsvdc for the prompt response. 

 

As we haven’t heard back from you, we wanted to kindly follow up to check whether the solution provided by the community members worked for your issue, or let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

Hi @BriefStop,

 

We wanted to kindly follow up to check whether the solution provided by the community members worked for your issue, or let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

nielsvdc
Super User

Hi @BriefStop, Copy Jobs are designed as standalone, scheduled tasks rather than pipeline activities, which means they don’t support orchestration features like parallel execution within a pipeline loop. That’s why you’re seeing the error that Fabric prevents multiple runs of the same Copy Job at the same time for consistency and resource management.

 

To use Copy Jobs effectively, you specify multiple tables for a single source, and the Copy Job runs all the copy processes in parallel with each other. But Copy Jobs are not designed with parameters that would let them be executed with dynamic input.

 

When you want to build a metadata-driven pipeline, your options are to use a Copy Data activity or a notebook with PySpark code. The latter is effectively the solution with the least compute overhead.

 

Hope this helps. If so, please give kudos 👍 and mark as Accepted Solution ✔️ to help others.
