hhuikuri
New Member

Issue with Copy Job and Variable Library: Destination Schema and Table Name becomes Invalid

Hello Fabric Community,

 

I’m working with Copy Job in Microsoft Fabric and trying to parameterize connections using the Variable Library. I’ve encountered an issue after switching the destination connection to use a variable.

 

Here’s what happens:

  1. Before using the Variable Library:
    • The destination connection is fixed.
    • In Table Mapping, I can define the destination schema and table name correctly:
      [screenshot: Table Mapping with schema and table name set]
  2. After editing the destination connection to use the Variable Library:
      [screenshot: destination connection driven by a variable]
  3. The destination schema field disappears from Table Mapping and the table name becomes invalid:
      [screenshot: Table Mapping showing an invalid table name]
  4. When the Copy Job runs, data is saved to the dbo schema with an incorrect table name like dbo_tblTarget.

Is there a workaround for this scenario? Any guidance or examples would be greatly appreciated!

1 ACCEPTED SOLUTION
v-dineshya
Community Support

Hi @hhuikuri ,

Thank you for reaching out to the Microsoft Community Forum.

 

Copy Job does not support schema-qualified table paths, only plain table names, and it only supports dynamic table names when the connection is static. Once the connection itself is variable-driven:

 

1. The schema cannot be overridden.

2. Table mapping becomes “best-effort”.

3. Metadata lookup fails.

4. Copy Job reverts to the default: dbo.<table>.

This is why your result remains the same no matter what table name you type.

 

Note: Copy Job cannot dynamically change the schema when using variables; this is by design and a current limitation.

 

Please try the alternative workarounds below.

 

1. Keep the connection static and only parameterize the table name. If your Lakehouse does not change, do NOT store the connection ID in the Variable Library.

Instead, use the normal Lakehouse connection and only set the destination table name from a variable.

 

Example:

Schema | Table
dbo    | @{variables('tableName')}

 

2. Use pipelines instead of Copy Job. If you must parameterize the connection ID, workspace ID, or Lakehouse ID, the pipeline Copy activity fully supports a dynamic schema and table; Copy Job does NOT (see the sketch below).
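
For illustration, here is a condensed sketch of a Copy activity destination with both fields driven by variables. It is written in the ADF/Fabric pipeline JSON style; the type and property names are assumptions for readability, and in practice you would set these fields through the Copy activity's Destination tab using the dynamic content editor:

    {
      "type": "Copy",
      "typeProperties": {
        "sink": { "type": "LakehouseTableSink" },
        "destination": {
          "schema": "@{variables('schemaName')}",
          "table": "@{variables('tableName')}"
        }
      }
    }

Both destination fields accept dynamic content, which is exactly what Copy Job's table mapping cannot do once the connection itself is variable-driven.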

 

Please refer to the link below.

How to create a Copy job in Data Factory - Microsoft Fabric | Microsoft Learn

 

I hope this information helps. Please do let us know if you have any further queries.

 

Regards,

Dinesh


6 REPLIES

Thank you! I need to copy over 200 tables from the source database on a daily basis, so I implemented the solution using a Fabric pipeline. I created a configuration table that the Lookup activity uses to retrieve all source and destination tables. Then, within a ForEach loop, the tables are copied using the Copy activity. Additionally, I configured the pipeline to leverage the Variable library, which makes it much easier to deploy across environments from development to test and from test to production.
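
For reference, a condensed sketch of that metadata-driven pattern. The names here (LookupConfig, dbo.ConfigTable, SourceSchema, SourceTable, DestTable) are illustrative assumptions, not the actual objects from the pipeline described above:

    {
      "LookupConfig": {
        "query": "SELECT SourceSchema, SourceTable, DestTable FROM dbo.ConfigTable"
      },
      "ForEach": {
        "items": "@activity('LookupConfig').output.value",
        "Copy": {
          "sourceTable": "@{item().SourceSchema}.@{item().SourceTable}",
          "destinationTable": "@{item().DestTable}"
        }
      }
    }

With this layout the Variable Library only needs to carry the environment-specific IDs (workspace, Lakehouse, connection), while the list of 200+ tables lives in the config table and can change without redeploying the pipeline.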
hhuikuri
New Member

I have tried table mapping without the schema, using only the table name:

[screenshot: table mapping with table name only]
After I apply the change, the result still stays the same:
[screenshot: unchanged result]

I believe the following topic might be the solution:

 

https://community.fabric.microsoft.com/t5/Pipelines/Load-data-to-different-schema-in-Fabric-Lakehous...

Hope this helps.


[Tip] Keep CALM and DAX on.
[Solved?] Hit “Accept as Solution” and leave a Kudos.
[About] Chiel | SuperUser (2023–2) |

Thanks, but this solution is for the pipeline Copy activity. Now I am struggling with Copy Job, which is not the same.

ChielFaber
Solution Specialist

I've tried searching for an answer to your problem. Could you try not specifying the dbo. part? Just set the variable to tblTarget and see if that helps.

 

The problem seems to be that if you enter dbo.tblTarget, Fabric interprets it as a table name containing a dot, which is not allowed for Lakehouse tables, hence the validation error.

 

From my understanding, for Lakehouse tables the schema is always dbo, so when the schema is parameterized the activity assumes dbo implicitly and only lets you edit the table name.
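
In other words, with two illustrative variable values:

    tableName = "dbo.tblTarget"   ->  rejected (the dot is treated as part of the table name)
    tableName = "tblTarget"       ->  accepted (the schema is implicitly dbo)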


[Tip] Keep CALM and DAX on.
[Solved?] Hit “Accept as Solution” and leave a Kudos.
[About] Chiel | SuperUser (2023–2) |
