RajeshKapur
Helper I

Data Pipeline Copy Data Update source using Deployment Rules

Hi,

 

We have two on-premises data gateways, one for Dev and one for QA, and we created two separate connections (connection type SQL Server, Basic authentication), one per environment, using the respective gateways.

 

We have also created a data pipeline in the Dev workspace with a Copy data activity that uses the Dev SQL Server connection as the source. When we move the pipeline from the Dev workspace to the QA workspace using a deployment pipeline, how can the source database be switched to the QA connection using Deployment Rules?

 

Thanks,
Rajesh Kapur

 

1 ACCEPTED SOLUTION

Hi,


To resolve the above problem, we are planning to perform the following steps:

 

Step 1: Create 2 separate connections in Fabric under "Manage connections and gateways", say SalesDB_Dev and SalesDB_Test, and keep a note of both connection IDs.

Step 2: Create a new data pipeline with a Copy data activity, with the SalesDB_Dev connection configured as the source and a Lakehouse configured as the sink.

Step 3: Enable Git integration on the Dev workspace and sync the content with an app_dev branch.

Step 4: Create a new app_test branch from the app_dev branch so that all the synced code is carried over to app_test.

Step 5: On the app_test branch, open the pipeline code file, i.e. "pipeline-content.json", and update the connection ID from SalesDB_Dev to SalesDB_Test. For reference:

"externalReferences": {
                      "connection": "d350327d-50da-4947-a9b2-f61ab33e1d37"
                    },

Step 6: Enable Git integration on the Test workspace to sync the content from the app_test branch.

Step 7: Once the sync completes, validate the connection in the pipeline's Copy data activity; it should now refer to SalesDB_Test.

 

Once the above approach is confirmed to work and there are no issues or challenges with it, we can automate the process of updating the connection ID in pipeline-content.json and syncing it to the respective workspace via the Fabric APIs from Azure DevOps (ADO) pipelines.
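The connection-ID update described in Step 5 (and its proposed automation) could be sketched as a small script like the one below. Note this is only a sketch: the exact layout of pipeline-content.json is an assumption based on the fragment above (activities under properties.activities, each with an externalReferences.connection entry), and the function name and placeholder IDs are hypothetical.

```python
import json

def swap_connection(pipeline_json: str, dev_conn_id: str, test_conn_id: str) -> str:
    """Replace the Dev connection ID with the Test one wherever an
    activity's externalReferences block references it."""
    content = json.loads(pipeline_json)
    # Assumed layout: activities live under properties.activities,
    # each with an optional externalReferences.connection entry.
    for activity in content.get("properties", {}).get("activities", []):
        refs = activity.get("externalReferences", {})
        if refs.get("connection") == dev_conn_id:
            refs["connection"] = test_conn_id
    return json.dumps(content, indent=2)

# Illustrative usage with the connection ID from the fragment above
# and a hypothetical placeholder for the Test connection ID:
sample = json.dumps({
    "properties": {
        "activities": [
            {"name": "CopySalesData",
             "externalReferences": {"connection": "d350327d-50da-4947-a9b2-f61ab33e1d37"}}
        ]
    }
})
updated = swap_connection(sample, "d350327d-50da-4947-a9b2-f61ab33e1d37",
                          "<SalesDB_Test-connection-id>")
```

An ADO pipeline step could run such a script against the checked-out app_test branch before the Fabric API sync.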

 

Please reply if anyone sees any challenge or issue with this.


4 REPLIES
RajeshKapur
Helper I

Hi,

 

As an option to resolve this, we are planning to perform the following steps:

 

Step 1: Create 2 separate connections in Fabric under "Manage connections and gateways", say SalesDB_Dev and SalesDB_Test.

 

Step 2: Create a data pipeline with a Copy data activity, configured with the SalesDB_Dev connection as the source and a Lakehouse as the sink.

 

Step 3: Enable Git integration on the Dev workspace and sync with, say, an app_dev branch.

 

Step 4: Once the code is synced with the app_dev branch, create a new app_test branch from app_dev.

 

Step 5: On the app_test branch, open the pipeline's "pipeline-content.json" file and update the connection value in the JSON from the SalesDB_Dev connection ID to the SalesDB_Test connection ID:

"externalReferences": {
  "connection": "d350327d-50da-4947-a9b2-f61ab33e1d37"
}

Step 6: Enable Git integration for the Test workspace and sync the content from the app_test branch into it.


Step 7: Validate the connection inside the pipeline; it should now be the SalesDB_Test connection.

 

Do let me know if anyone sees any challenge or issue with this approach. Once we are good with it, we can automate the process using ADO pipelines: the pipeline will first update the connection ID in the test branch and then sync the content to the respective workspace using the Fabric APIs.

 

Thanks,

Rajesh Kapur 

 

 

FabianSchut
Super User

Hi, unfortunately, you cannot update the connection with deployment rules; that is not supported yet. One workaround is to use a Switch activity in the pipeline that branches on the workspace ID, via the expression @pipeline().DataFactory,

and use one Copy activity in each of the cases. Use the Dev connection and gateway if the workspace ID matches your Dev workspace, and use the other for QA.
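A rough sketch of that workaround as pipeline JSON, under the assumption that a Switch activity can evaluate @pipeline().DataFactory (the activity names, workspace-ID placeholders, and exact schema here are illustrative, not the verbatim Fabric pipeline schema):

```json
{
  "name": "SwitchOnWorkspace",
  "type": "Switch",
  "typeProperties": {
    "on": { "value": "@pipeline().DataFactory", "type": "Expression" },
    "cases": [
      {
        "value": "<dev-workspace-id>",
        "activities": [ { "name": "CopyFromDev", "type": "Copy" } ]
      },
      {
        "value": "<qa-workspace-id>",
        "activities": [ { "name": "CopyFromQA", "type": "Copy" } ]
      }
    ]
  }
}
```

Each Copy activity would then reference its own environment's connection and gateway.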


Looks like a plan that could theoretically work. I tried to put a pipeline parameter in the connection part of the JSON code, but that did not work. A hard-coded connection ID may work. I'm interested too; please let us know how it goes.
