

JibinSebastian
Helper II

Copy activity uses Linked services in Dev but Connection setting in UAT

Hello Folks,

I use a copy activity to fetch tables from SQL Server and copy them into a lakehouse, where they land under Tables/bronze/table_name.

JibinSebastian_0-1763570459083.png

However, when I deployed this pipeline to UAT, the tables instead appear directly under Tables/table_name.

JibinSebastian_1-1763570539322.png

The Copy activity destination is the same in Dev and UAT.

JibinSebastian_2-1763570637845.png

 


When I dug deeper, I found that in Dev it is using a linked service, while in UAT it is using a connection setting.

Could someone help me fix this issue? Any leads would be much appreciated.







7 REPLIES
apturlov
Responsive Resident

@JibinSebastian Could you clarify: are you running one copy activity per SQL table? Could you share a screenshot of the pipeline you use? How do you supply the list of source tables in the SQL database? And where is your SQL database hosted: Azure, Fabric, or on-premises?

apturlov
Responsive Resident

Hi @JibinSebastian, I am a bit confused when you say that the Copy activity uses a "Linked service". As far as I know, there are no linked services in Microsoft Fabric.

 

Now let's clarify what you mean by "Dev" and "UAT". Are those different workspaces? If so, do you have a lakehouse with the same name in each workspace?

 

The screenshot from your Copy activity configuration looks different from what I expect: it's missing a Lakehouse name even though it's using a lakehouse connection. I would expect a different configuration screen:

apturlov_0-1763600664635.png

Technically, you should also parameterize the destination lakehouse name in the Copy activity configuration: when you move your artifacts between environments, the lakehouse stores should be expected to differ as well.

 

Finally, in your UAT screenshot, what you are seeing under Tables are not delta tables but namespaces. In the file structure of a lakehouse, namespaces are the subfolders that contain the delta tables. So it seems your copy activity did not produce any delta tables at all.
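As an illustration (the folder and table names here are hypothetical), a schema-enabled lakehouse lays tables out roughly like this, so a folder directly under Tables/ with no _delta_log inside is a namespace rather than a table:

```
Tables/
  bronze/              <- schema (namespace) folder
    customer/          <- delta table folder
      _delta_log/      <- presence of this folder marks a delta table
  customer/            <- namespace created by mistake, no _delta_log inside
```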

 

More investigation is required.

@apturlov  Thanks for responding.

Dev, UAT, and PROD are different workspaces.
I have the same lakehouse name, "BL_IEDP", in all three workspaces.
Below is a screenshot of the current copy activity.

JibinSebastian_0-1763647473517.png

The dynamic expression @item().TargetSchema resolves to the value "bronze" and @item().TargetTableName resolves to the corresponding table name.

The issue is that I want to move data from SQL into the bronze schema in the lakehouse; unfortunately, this copy activity is creating a new schema for each table.
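A small simulation may help show why this can happen (this sketches the apparent behaviour, not Fabric's actual sink code): if the Table field receives only a bare name instead of schema.table, a schema-enabled lakehouse can treat that value as a top-level namespace directly under Tables/.

```python
# Hypothetical simulation of how a schema-enabled Lakehouse sink might map
# a Table value onto a folder path (not Fabric's actual implementation).
def table_path(table_value: str) -> str:
    # 'schema.table' -> Tables/schema/table; a bare name ends up
    # directly under Tables/, where it appears as a namespace.
    return "Tables/" + table_value.replace(".", "/")

print(table_path("bronze.customer"))  # Tables/bronze/customer
print(table_path("customer"))         # Tables/customer
```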



Hi @JibinSebastian 

Thank you for reaching out to the Microsoft Community Forum.

 

You're correct: having identical lakehouse names in different workspaces is fine, since each workspace has its own separate BL_IEDP lakehouse. However, the copy activity must target the correct lakehouse and pass the table identifier in the exact format the sink expects. What you described (the copy creating a new schema for every table) strongly points to a problem in the Table expression you are passing to the Lakehouse sink.

Use one combined expression for the Table field so the sink receives schema.table, for example concat(item().TargetSchema, '.', item().TargetTableName) (or @concat(item().TargetSchema, '.', item().TargetTableName)). Then:

- Ensure the sink points at the Tables root and the correct lakehouse connection (parameterize the lakehouse for safe deployments).
- Set the Table action/writeMode to create or upsert so delta tables are created.
- Run a debug and inspect the evaluated Table string in the pipeline run to confirm the exact target.
- If namespaces still appear, check TargetSchema and TargetTableName for stray characters (spaces, slashes) and paste the evaluated Table string here.
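As a sketch of what the combined expression should evaluate to, here is a small Python equivalent, including the stray-character check suggested above. The field names TargetSchema and TargetTableName come from this thread; the validation rule itself is an assumption, not Fabric behaviour.

```python
import re

def build_table_value(item: dict) -> str:
    """Mirror of concat(item().TargetSchema, '.', item().TargetTableName),
    with a defensive check for characters that would break the table path."""
    schema = item["TargetSchema"].strip()
    table = item["TargetTableName"].strip()
    for part in (schema, table):
        # Only allow word characters; spaces or slashes indicate bad input.
        if not re.fullmatch(r"\w+", part):
            raise ValueError(f"unexpected characters in {part!r}")
    return f"{schema}.{table}"

print(build_table_value({"TargetSchema": "bronze",
                         "TargetTableName": "customer"}))
# bronze.customer
```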

If you have any more questions, please let us know and we'll be happy to help.

Regards,

Microsoft Fabric Community Support Team

@v-karpurapud 

Thank you so much for responding to my message, I really appreciate it, and apologies for the late response.
I created a new ingestion pipeline, which solved the issue.

Thanks again.

Hi @JibinSebastian 

Thank you for the update. If you have any more questions, please let us know and we'll be happy to help.

Regards,

Microsoft Fabric Community Support Team

The Fabric Community is always a platform that helps me solve my Fabric-related challenges. Thanks. If I face any issue in the future, I will definitely post it here.
