I have to copy n tables from an on-prem SQL database to a Lakehouse with a Copy Job.
Do I create n different copy jobs (similar to creating n different dataflows),
or
can I create a single Copy Job to accommodate copying all n tables in the same job (similar to creating one dataflow that has n queries)?
Hi @smpa01 ,
Could you please confirm if you've submitted this as an idea in the Ideas Forum? If so, sharing the link here would be helpful for other community members who may have similar feedback.
If we don’t hear back, we’ll go ahead and close this thread. For any further discussions or questions, please start a new thread in the Microsoft Fabric Community Forum, and we’ll be happy to assist.
Thank you for being part of the Microsoft Fabric Community.
Hi @smpa01 ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi @smpa01 ,
Thank you for reaching out to the Microsoft Fabric Community Forum.
As @suparnababu8 mentioned, you can use a single Copy Job to transfer multiple tables from your on-premises SQL Server to a Lakehouse. When setting up the Copy Job, you can select and configure several tables in one job, which makes monitoring easier and reduces overhead. This method works best if the tables have the same refresh schedule and similar load settings.
However, if some tables need different refresh schedules or specific load strategies (such as full versus incremental loads with different watermark columns), it’s better to create separate copy jobs for those tables.
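The watermark idea mentioned above can be sketched in plain Python. This is a minimal illustration, not the Copy Job implementation: the table, column names, and data are hypothetical, and an in-memory sqlite3 database stands in for the on-prem SQL Server. The point is simply that an incremental load only fetches rows whose watermark column is newer than the value saved from the previous run.

```python
import sqlite3

# Hypothetical source table standing in for the on-prem SQL database.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, modified_at TEXT)")
src.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, "2024-01-01"), (2, "2024-02-01"), (3, "2024-03-01")],
)

def incremental_extract(conn, table, watermark_col, last_watermark):
    """Fetch only rows changed since the last run, the way an
    incremental load uses its watermark column."""
    rows = conn.execute(
        f"SELECT * FROM {table} WHERE {watermark_col} > ?",
        (last_watermark,),
    ).fetchall()
    # The new watermark is the max value seen; persist it for the next run.
    new_watermark = max((r[1] for r in rows), default=last_watermark)
    return rows, new_watermark

# Only the two orders modified after the saved watermark are returned.
rows, wm = incremental_extract(src, "orders", "modified_at", "2024-01-15")
```

Because each table can have a different watermark column and a different saved value, tables with divergent load strategies are easier to manage in separate jobs, as noted above.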
Hope this helps. Please reach out if you need further assistance.
If this post helps, please consider accepting it as the solution to help other members find it more quickly; a kudos would also be appreciated.
Thank you.
Copy Job is different from the Data Factory Copy activity. I was referring to the former, not the latter. Also, once I create that Copy Job for the first table, how do I go back and edit the job (adding the 2nd table for copying, then the 3rd table, etc.)? @v-tsaipranay
Hi @smpa01 ,
Thanks for your follow-up and for pointing out the difference between Fabric’s Copy Job and the traditional Data Factory Copy activity.
Right now, once you create a Copy Job in Microsoft Fabric, you can’t go back and edit it to add or remove tables. If you need to include more tables, the best option is to create a new Copy Job with the updated selection.
We understand this can be limiting, and we really appreciate your feedback. Improvements in this area are likely in future updates. If you’d like to share your thoughts directly with the product team, please consider submitting your idea through the forum link below: Fabric Ideas - Microsoft Fabric Community
If this post helps, please consider accepting it as the solution to help other members find it more quickly; a kudos would also be appreciated.
Thank you.
Hi @smpa01
There's no need to create n different copy jobs to copy n tables from your on-premises database to a Lakehouse. Just one copy activity is enough. I recently published a blog on this in the Fabric blogs section.
Please go through this blog - Seamless Data Migration from On-Prem SQL Server to... - Microsoft Fabric Community
Hope this helps you.
Thank you!
Did I answer your question? Mark my post as a solution!
Proud to be a Super User!