Hello Folks,
I use a Copy activity to fetch tables from SQL Server and copy them into a lakehouse, where they land under Tables/bronze/table_name.
When I deployed this pipeline to UAT, however, the tables appear directly under Tables/table_name.
The Copy activity destination is the same in Dev and UAT.
Digging deeper, I found that Dev uses a linked service while UAT uses a connection setting.
Could someone help me fix this issue? Any leads would be much appreciated.
Solved! Go to Solution.
@v-karpurapud
Thank you so much for responding to my message; I really appreciate it, and apologies for the late response.
I created a new ingestion pipeline and that solved the issue.
Thanks again.
@JibinSebastian Could you clarify: are you running a separate Copy activity for each SQL table? Could you share a screenshot of the pipeline you use? How do you supply the list of source tables in the SQL database? And where is your SQL database hosted: Azure, Fabric, or on-premises?
Hi @JibinSebastian, I am a bit confused when you say that the Copy activity uses a "Linked service". As far as I am aware, there are no linked services in Microsoft Fabric.
Now let's clarify what you mean by "Dev" and "UAT". Are those different workspaces? If so, do you have a lakehouse with the same name in each workspace?
The screenshot of your Copy activity configuration looks different from what I expect: it is missing a lakehouse name even though it uses a lakehouse connection. I would expect a different configuration screen:
Technically, you should also parameterize the destination lakehouse name in the Copy activity configuration, because when you move artifacts between environments you should expect the lakehouse stores to differ as well.
Finally, in your UAT screenshot, what you are seeing under Tables are not Delta tables but namespaces. In the file structure of a lakehouse, namespaces are subfolders that contain the Delta tables. So it seems your Copy activity did not produce any Delta tables at all.
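As a rough sketch of that parameterization (the parameter name LakehouseName is an assumption, not something taken from your pipeline), you could define a string parameter on the pipeline, override its value per environment, and reference it in the sink instead of a hard-coded lakehouse selection:

```
Pipeline parameter (Dev value shown; override it in UAT/PROD):
  LakehouseName : string = "BL_IEDP"

Sink lakehouse reference, instead of a fixed selection:
  @pipeline().parameters.LakehouseName
```

This way the same deployed pipeline resolves to the right lakehouse in each workspace.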
More investigation is required.
@apturlov Thanks for responding.
Dev, UAT and PROD are different workspaces.
I have a lakehouse with the same name, "BL_IEDP", in all three workspaces.
Below is a screenshot of the current Copy activity.
The dynamic expression @item().TargetSchema resolves to the value "bronze" and @item().TargetTableName resolves to the corresponding table name.
The issue is that I want to move data from SQL into the bronze schema of the lakehouse, but unfortunately this Copy activity is creating a new schema for each table.
Thank you for reaching out to the Microsoft Community Forum.
You're correct: having identical lakehouse names in different workspaces is fine, since each workspace has its own separate BL_IEDP lakehouse. However, the Copy activity must target the correct lakehouse and pass the table identifier in the exact format the sink expects. What you described (the copy creating new schemas for every table) strongly points to a problem in the Table expression you are passing to the Lakehouse sink.
- Use one combined expression for the Table field so the sink gets schema.table, for example concat(item().TargetSchema, '.', item().TargetTableName) (or @concat(item().TargetSchema, '.', item().TargetTableName)).
- Ensure the sink points at the Tables root and the correct lakehouse connection (parameterize the lakehouse for safe deployments).
- Set the Table action/writeMode to create or upsert so the Delta tables are created.
- Run a debug run and inspect the evaluated Table string in the pipeline run output to confirm the exact target.
- If namespaces still appear, check TargetSchema/TargetTableName for stray characters (spaces, slashes) and paste the evaluated Table string here.
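To make the intent concrete, here is a minimal Python sketch (not Fabric itself) of what the combined expression should evaluate to, plus the "stray character" check suggested above; the identifier rule used here (letters, digits, underscores) is an assumption for illustration:

```python
import re

# Characters allowed in a schema or table identifier for this sketch.
VALID_IDENTIFIER = re.compile(r"^[A-Za-z0-9_]+$")

def qualified_table_name(schema: str, table: str) -> str:
    """Mimic @concat(item().TargetSchema, '.', item().TargetTableName):
    return 'schema.table' for the Lakehouse sink's Table field, rejecting
    values with spaces, slashes, or other stray characters that would
    make the sink create unintended namespaces."""
    for part in (schema, table):
        if not VALID_IDENTIFIER.match(part):
            raise ValueError(f"suspicious identifier: {part!r}")
    return f"{schema}.{table}"

print(qualified_table_name("bronze", "customers"))  # prints bronze.customers
```

If the evaluated string in your pipeline run does not look like bronze.table_name, the sink will not land the data where you expect.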
If you have any more questions, please let us know and we'll be happy to help.
Regards,
Microsoft Fabric Community Support Team
Hi @JibinSebastian
Thank you for the update. If you have any more questions, please let us know and we'll be happy to help.
Regards,
Microsoft Fabric Community Support Team
The Fabric Community is always a platform that helps me solve my Fabric-related challenges. Thanks. If I face any issues in the future, I will definitely post them here.