Midway
Helper I

Can I use stored procedures to import data from an on-prem SQL Server to a Lakehouse?

I have been researching this topic a lot with no luck, so I decided to post here to see if someone out there has implemented this method somehow. The issue I'm finding is that when using the Copy data activity for large tables, it takes far too long to bring roughly 3 million records into a Lakehouse table, and that's not acceptable. So I'm trying to use stored procedures to see if I can bring the big tables into a Lakehouse and speed up the process somehow.

Let me know please.

Thanks!

 

1 ACCEPTED SOLUTION
jwinchell40
Super User

@Midway - There are a number of levers you can pull to further tune the Copy activity process. Sandeep Pawar has a good post covering some of the options (Boosting Copy Activity Throughput in Fabric). You could also look at the "Copy Job" feature that was just released in preview; it's supposed to be even faster than the Copy activity.

 

Here is another good article:

Copy activity performance and scalability guide - Azure Data Factory & Azure Synapse | Microsoft Lea...
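
For reference, the two biggest levers those posts cover, source-side partitioning and the degree of copy parallelism, end up in the Copy activity definition (Azure Data Factory / Fabric pipeline JSON) roughly like the sketch below. This is a minimal, illustrative example only: the Id partition column and the row-count bounds are placeholders, and the Lakehouse sink settings are omitted.

```json
{
  "name": "CopyLargeTable",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "SqlServerSource",
      "partitionOption": "DynamicRange",
      "partitionSettings": {
        "partitionColumnName": "Id",
        "partitionLowerBound": "1",
        "partitionUpperBound": "3000000"
      }
    },
    "parallelCopies": 8
  }
}
```

With DynamicRange the service splits the source table into ranges on the partition column and reads them in parallel, which is usually where the big wins come from on multi-million-row tables.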


3 REPLIES

smeetsh
Resolver II

You will need to create an on-premises data gateway in your local network that has access to the SQL Server you are trying to get the data from.

Once you have a gateway, create a SQL connection using that on-prem gateway and point it at your local SQL Server.

Last but not least, create a Fabric pipeline with a Copy data activity. In that activity, choose the SQL connection you created above and select the database and table you need.

That will allow you to copy the whole table. The other option, probably closer to what you are looking for, is creating a Script activity in a pipeline, using the above connection, and just running a script.
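
On the original stored-procedure question: the Copy activity's SQL source can also point at a stored procedure instead of a table, so any filtering or shaping can stay in SQL Server. A minimal sketch of just the source block, with a made-up procedure name and parameters:

```json
{
  "source": {
    "type": "SqlServerSource",
    "sqlReaderStoredProcedureName": "dbo.usp_GetOrdersBatch",
    "storedProcedureParameters": {
      "FromId": { "value": "1" },
      "ToId": { "value": "1000000" }
    }
  }
}
```

One caveat: as far as I know, the source partitioning options don't apply when the source is a stored procedure, so this helps with pushing logic down to SQL Server rather than with raw copy throughput.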


 




NandanHegde
Super User

Unfortunately, that is not possible, as the on-prem SQL Server and the Lakehouse are separate frameworks and cannot be linked directly for you to create a stored procedure in one that queries the other.

You would have to use a data transfer tool, which can be either a data pipeline Copy activity or Dataflow Gen2.




----------------------------------------------------------------------------------------------
Nandan Hegde (MSFT Data MVP)
LinkedIn Profile: www.linkedin.com/in/nandan-hegde-4a195a66
GitHub Profile: https://github.com/NandanHegde15
Twitter Profile: @nandan_hegde15
MSFT MVP Profile: https://mvp.microsoft.com/en-US/MVP/profile/8977819f-95fb-ed11-8f6d-000d3a560942
Topmate: https://topmate.io/nandan_hegde
Blog: https://datasharkx.wordpress.com
