Midway
Helper I

Can I use stored procedures to import data from an on-prem SQL Server to a lakehouse?

I have been researching this topic a lot with no luck, so I decided to post here to see if someone out there has implemented this method somehow. The issue I'm finding is that when using the Copy data activity for large tables, it takes far too long to bring roughly 3 million records into a lakehouse table, and that's not acceptable. So I'm trying to use stored procedures to see if I can bring the big tables into a lakehouse and speed up the process somehow.

Let me know please.

Thanks!

 

1 ACCEPTED SOLUTION
jwinchell40
Super User

@Midway - There are a number of levers you can pull to further tune the Copy activity process. Sandeep Pawar has a good post covering some of the options (Boosting Copy Activity Throughput in Fabric). You could also look at the "Copy Job" feature that was just released in preview. It's supposed to be faster still than the Copy activity.

 

Here is another good article:

Copy activity performance and scalability guide - Azure Data Factory & Azure Synapse | Microsoft Lea...
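
For illustration only (this is not taken from the linked posts), the levers that usually matter most are parallel copies plus source partitioning. Below is a minimal sketch written as a Python dict that mirrors the Copy activity's JSON; the property names follow the Azure Data Factory copy activity schema, and the column name, bounds, parallelism, and sink type are placeholder assumptions you would tune for your own table.

```python
# Rough sketch of Copy activity tuning levers, expressed as a Python dict that mirrors
# the pipeline JSON (property names follow the Azure Data Factory copy activity schema;
# the exact names in the JSON Fabric generates may differ slightly).
copy_activity_type_properties = {
    "source": {
        "type": "SqlServerSource",
        # Read the source table in parallel range slices instead of one sequential scan.
        "partitionOption": "DynamicRange",
        "partitionSettings": {
            "partitionColumnName": "Id",       # assumption: an indexed, evenly distributed key
            "partitionLowerBound": "1",        # assumption: minimum value of Id
            "partitionUpperBound": "3000000",  # assumption: maximum value of Id (~3M rows)
        },
    },
    "sink": {"type": "LakehouseTableSink"},    # illustrative name; check the JSON Fabric generates
    # Number of parallel connections the Copy activity opens against the source.
    "parallelCopies": 8,
}
```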


3 REPLIES

smeetsh
Helper V

You will need to create an on-premises data gateway in your local network that has access to the SQL Server you are trying to get the data from.

 

Once you have a gateway, create a SQL connection that uses that on-prem gateway and connects to your local SQL Server.

Last but not least, create a Fabric pipeline with a Copy data activity; in that activity choose the SQL connection you created above, selecting the database and the table you need.

That will allow you to copy the whole table. The other option, probably more what you are looking for, is creating a Script activity in a pipeline, using the above connection, and just running a script.

[attached screenshot: smeetsh_0-1728609991845.png]
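
Building on the script/query idea above, one common pattern (an assumption on my part, not something shown in the screenshot) is to push a filter down to the on-prem SQL Server so each run only pulls new rows instead of the full 3 million. A minimal sketch; the table, column, and watermark value are hypothetical placeholders.

```python
# Hypothetical sketch: build the query you would paste (or parameterize) as the source
# query of the Copy activity, or run from a Script activity against the same connection.
# dbo.BigTable, ModifiedDate, and the watermark value are placeholder assumptions.
last_watermark = "2024-01-01T00:00:00"  # in a real pipeline this would come from a Lookup/variable

source_query = f"""
SELECT *
FROM dbo.BigTable
WHERE ModifiedDate > '{last_watermark}'
"""

print(source_query)
```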

 




NandanHegde
Super User

Unfortunately, that is not possible, as on-prem SQL Server and the lakehouse are separate frameworks and cannot be linked directly, so you cannot create a stored procedure in one and query the other.

You would have to use some data transfer tool, which can be either a data pipeline Copy activity or Dataflow Gen2.
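
To illustrate the "separate frameworks" point: once the pipeline lands the data, the lakehouse table is a Delta table you work with through Spark (for example in a Fabric notebook), not something a SQL Server stored procedure can write to directly. A minimal sketch, assuming a notebook attached to the Lakehouse and a hypothetical table name.

```python
from pyspark.sql import SparkSession

# In a Fabric notebook attached to the Lakehouse a `spark` session already exists;
# getOrCreate() simply returns it. The table name below is a hypothetical placeholder
# for whatever the Copy activity / Dataflow Gen2 loaded.
spark = SparkSession.builder.getOrCreate()

df = spark.read.table("dbo_bigtable")
print(df.count())  # sanity-check that the ~3 million rows arrived
```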




----------------------------------------------------------------------------------------------
Nandan Hegde (MSFT Data MVP)
LinkedIn Profile: www.linkedin.com/in/nandan-hegde-4a195a66
GitHub Profile: https://github.com/NandanHegde15
Twitter Profile: @nandan_hegde15
MSFT MVP Profile: https://mvp.microsoft.com/en-US/MVP/profile/8977819f-95fb-ed11-8f6d-000d3a560942
Topmate: https://topmate.io/nandan_hegde
Blog: https://datasharkx.wordpress.com
