I have been researching this topic a lot with no luck, so I decided to post here to see if someone out there has implemented this method somehow. The issue I'm finding is that when using the Copy Data activity for large tables, it takes far too long to bring around 3 million records into a lakehouse table, and that's not acceptable. So I'm trying to use stored procedures to see if I can bring the big tables into a lakehouse and speed up the process somehow.
Please let me know.
Thanks!
@Midway - There are a number of levers you can pull to further tune the Copy Activity process. Sandeep Pawar has a good post covering some of the options (Boosting Copy Activity Throughput in Fabric). You could also look at the "Copy Job" feature that was just released in preview. It's supposed to be faster still than the Copy Activity.
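As a rough illustration of one of those levers (parallel reads over non-overlapping key ranges), each of several Copy Activities running side by side could use a source query like the sketch below. dbo.FactSales, SaleID, and the bounds are made-up names and values, assuming the table has a numeric key to slice on:

```sql
-- Hypothetical range slice: run N copies of this query with
-- non-overlapping bounds so several Copy Activities read in parallel.
-- dbo.FactSales and SaleID are placeholder names.
SELECT *
FROM dbo.FactSales
WHERE SaleID >= 0         -- lower bound for this slice
  AND SaleID < 1000000;   -- upper bound; adjust per slice
```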
You will need to create an on-premises data gateway in your local network that has access to the SQL Server you are trying to get the data from.
Once you have the gateway, create a SQL connection that uses it and points to your local SQL Server.
Last but not least, create a Fabric pipeline with a Copy Data activity. In that activity, choose the SQL connection you created above and select the database and table you need.
That will allow you to copy the whole table. The other option, probably closer to what you are looking for, is to create a Script activity in a pipeline that uses the same connection and just runs a script; a stored-procedure sketch along those lines follows below.
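Since the original question was about stored procedures, one pattern that combines the two is to wrap a key-range slice in a procedure and point the Copy Data activity's source at it (the SQL Server source supports a stored procedure in place of a table or query). A minimal sketch, with placeholder table, column, and procedure names:

```sql
-- Hypothetical sketch: one key-range slice wrapped in a stored procedure,
-- intended to be used as the source of a Copy Data activity.
-- dbo.FactSales, SaleID, and usp_GetFactSalesSlice are placeholder names.
CREATE OR ALTER PROCEDURE dbo.usp_GetFactSalesSlice
    @LowerBound BIGINT,
    @UpperBound BIGINT
AS
BEGIN
    SET NOCOUNT ON;

    SELECT *
    FROM dbo.FactSales
    WHERE SaleID >= @LowerBound
      AND SaleID < @UpperBound;
END;
```

Note that the procedure by itself won't be faster than a plain table read; any speed-up comes from running several slices in parallel, each with different bounds.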
Unfortunately, that is not possible, as the on-premises SQL Server and the lakehouse are separate systems and cannot be linked directly for you to create a stored procedure on one that queries the other.
You would have to use a data transfer tool, which can be either a data pipeline Copy activity or a Dataflow Gen2.