Use Case/Story:
I have many SQL Server data sources configured with my on-premises gateway (let's say 20-50 data sources). I need to copy a sizable amount of data (say 50-200 tables) from each data source, and land that data in a corresponding Lakehouse (20-50 lakehouses, one per data source). This is a prerequisite to creating my semantic models, reports, etc.
Method:
To copy that much data, it is my understanding that Microsoft would recommend a data pipeline that uses the Copy Data activity to perform the transfer. The ideal solution is to loop over all my data sources and invoke a Copy Data activity for each data source/lakehouse pairing, as sketched below.
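For illustration, here is a minimal Python sketch of the orchestration being described: iterate over every data-source/lakehouse pairing and launch one copy job per pairing. The `run_copy_activity` helper and the pairing list are hypothetical stand-ins for a parameterized Copy Data activity inside a ForEach loop, not a real Fabric API:

```python
# Conceptual sketch only: one copy job per data-source/lakehouse pairing.
# run_copy_activity is a hypothetical stand-in for a parameterized
# Copy Data activity running inside a pipeline ForEach loop.

PAIRINGS = [
    # (SQL Server connection name, target lakehouse) -- illustrative values
    ("sql-vendor-a", "lakehouse_vendor_a"),
    ("sql-vendor-b", "lakehouse_vendor_b"),
    # ... 20-50 entries in practice
]

def run_copy_activity(connection: str, lakehouse: str) -> None:
    """Hypothetical helper: copy all configured tables from one
    SQL Server connection into its corresponding lakehouse."""
    print(f"Copying {connection} -> {lakehouse}")

for connection, lakehouse in PAIRINGS:
    run_copy_activity(connection, lakehouse)
```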
Problem:
The Copy Data pipeline activity does not support dynamic content for SQL Server connection types. I would therefore need to create, deploy, and maintain 20-50 pipelines that are almost exactly identical, the only distinguishing feature being the SQL Server connection selected in the source configuration. This is untenable given the maintenance burden of that many pipelines, plus the added requirement of having a person designate each connection via the UI, or implementing a git script to identify the data source and apply it to the pipeline (or otherwise storing 50 copies of essentially the same pipeline in my repository).
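One way to soften the maintenance burden hinted at above is to generate the near-identical pipeline definitions from a single template rather than maintaining them by hand. A minimal sketch, assuming the pipeline definition is stored as JSON in git and the connection reference sits behind a `{{CONNECTION_ID}}` placeholder (both assumptions; the actual Fabric pipeline schema and repo layout may differ):

```python
from pathlib import Path

# Assumed inputs: a template pipeline JSON checked into git, plus the list
# of connection IDs. The {{CONNECTION_ID}} placeholder is an assumption
# about how the template marks the connection reference.
TEMPLATE = Path("pipelines/copy_template.json")
CONNECTION_IDS = ["conn-guid-1", "conn-guid-2"]  # illustrative values

template_text = TEMPLATE.read_text()

for conn_id in CONNECTION_IDS:
    # Swap in the real connection ID, then write one pipeline definition
    # per data source back into the repository.
    definition = template_text.replace("{{CONNECTION_ID}}", conn_id)
    out = Path(f"pipelines/copy_{conn_id}.json")
    out.write_text(definition)
    print(f"Generated {out}")
```

This keeps a single source of truth in the template, at the cost of a regeneration step whenever the shared logic changes.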
(Im)possible Workarounds/Non-Solutions:
Questions:
I don't have a solution, but I can see how having this feature could be beneficial. While having that many similar pipelines requires a bit of initial setup, I have found that there are benefits to having multiple pipelines.
My organization has a similar setup with many servers, usually one for each vendor. Sometimes a load from the vendor to our SQL Server fails, which then causes our pipeline to fail. By having individual pipelines for each server, the impact of a failed load is limited to just the tables loaded in that pipeline, rather than our entire data library.
Hi @IntegrateGuru ,
Have you solved your problem? If so, can you share your solution here and mark it as the accepted answer to help other members find it faster? Thank you very much for your kind cooperation!
Best Regards
Yilong Zhou
Hi @IntegrateGuru ,
Here are some of my thoughts and suggestions that I hope will help you.
Regarding dynamic content support: Currently, the Copy Data activity does not support dynamic content for SQL Server connection types. This means you would need to create, deploy, and maintain multiple nearly identical pipelines, one for each SQL Server connection, which does create a significant maintenance effort. You can also check out this topic for more information: Solved: Re: Dynamic Connection To Sources - Microsoft Fabric Community
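For context, dynamic content in a Copy Data activity can parameterize other source properties, such as the schema and table names, via pipeline expressions like `@item().tableName` inside a ForEach loop; it is only the connection itself that stays fixed for SQL Server. A rough sketch of the relevant fragment, expressed here as a Python dict for readability (the field names approximate the pipeline JSON and are not an exact Fabric schema):

```python
# Approximate shape of a Copy Data source: table and schema resolve
# per ForEach item via dynamic content, while the connection must be a
# fixed reference. Field names are illustrative, not an exact schema.
copy_source = {
    "connection": "my-sql-server-connection",  # static: cannot be an expression
    "schema": "@item().schemaName",            # dynamic: resolved per ForEach item
    "table": "@item().tableName",              # dynamic: resolved per ForEach item
}
```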
Regarding whether there are plans to add support: Based on current information, there are no definite plans to add dynamic content support for SQL Server connection types. You can submit this as a feature request on the Fabric Ideas site: Home (microsoft.com)
Best Regards
Yilong Zhou
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.