Microsoft Fabric Community Conference 2025, March 31 - April 2, Las Vegas, Nevada. Use code MSCUST for a $150 discount.
I've been testing the Copy Data activity from an on-premises SQL Server via a Data Gateway, which was recently made available. I have successfully managed to load into a Lakehouse, but when trying to get data into a Warehouse it requires that staging is enabled, does not support workspace staging, and needs a connection to an Azure environment. I'm assuming this is intentional, but are there any plans to allow this to use the staging Warehouse objects in the future? It would be nice to keep everything in OneLake, which would seem to fit with its described purpose.
Solved! Go to Solution.
Hi @MOVC , @Joshrodgers123 , @ShaunBrewer
The internal team confirmed that this is on the roadmap. At present we don't have any ETA, but we can expect this feature in the future.
Currently, it is not possible to directly copy data from an on-premises SQL Server (via the gateway) to a Fabric Warehouse. The recommended workaround is to first copy data from SQL Server to a Lakehouse (Copy activity 1) and then move it from the Lakehouse to the Warehouse (Copy activity 2). However, this is not an optimal solution, especially for small datasets, as it introduces unnecessary complexity and inefficiency.
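For anyone unfamiliar with chaining Copy activities, the two-hop pattern looks roughly like the sketch below. This is an illustrative fragment only, not a complete Fabric pipeline definition: the pipeline name, activity names, and source/sink type strings are placeholders, and the real connection and dataset settings are omitted.

```json
{
  "name": "OnPremToWarehouse",
  "activities": [
    {
      "name": "CopySqlToLakehouse",
      "type": "Copy",
      "typeProperties": {
        "source": { "type": "SqlServerSource" },
        "sink": { "type": "LakehouseSink" }
      }
    },
    {
      "name": "CopyLakehouseToWarehouse",
      "type": "Copy",
      "dependsOn": [
        {
          "activity": "CopySqlToLakehouse",
          "dependencyConditions": [ "Succeeded" ]
        }
      ],
      "typeProperties": {
        "source": { "type": "LakehouseSource" },
        "sink": { "type": "WarehouseSink" }
      }
    }
  ]
}
```

The key point is the `dependsOn` entry on the second Copy activity: it only runs after the Lakehouse hop succeeds, which is what makes the Lakehouse a staging layer in this workaround.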
We request the Fabric team to address this issue as soon as possible. Delaying this feature may lead to significant rework in the future if direct data transfer from SQL Server to the Warehouse is eventually enabled.
Hello and many thanks. There are links indicating it is possible; this link actually explains how to do it: Microsoft Fabric: Import On Premise SQL Server 2022 Data into Microsoft Fabric with Data Pipelines!! Can you kindly take a look? I sincerely appreciate your help.
One workaround may be to use the gateway to get the data, but instead of writing it to a table, you write it to a CSV file; in the next activity you read that CSV file and ingest it into the Warehouse. We ingest several CSV files that way.
This way the Lakehouse is only used to store a file, and nothing else.
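To make the CSV hand-off concrete, here is a minimal Python sketch of the serialize/parse round trip. The rows, column names, and functions are stand-ins for illustration: in the real pipeline the rows would come from the on-premises SQL Server query through the gateway, the CSV text would live as a file in the Lakehouse, and the parsed rows would feed the Warehouse copy.

```python
import csv
import io

# Stand-in for the result of the gateway query against on-prem SQL Server.
rows = [
    {"id": "1", "name": "alpha"},
    {"id": "2", "name": "beta"},
]

def rows_to_csv(rows):
    """Serialize query results to CSV text (the file staged in the Lakehouse)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "name"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def csv_to_rows(text):
    """Parse the staged CSV back into rows for the Warehouse-bound copy."""
    return list(csv.DictReader(io.StringIO(text)))

staged = rows_to_csv(rows)   # Copy activity 1: gateway -> Lakehouse file
loaded = csv_to_rows(staged) # Copy activity 2: Lakehouse file -> Warehouse
```

Because the Lakehouse only ever holds the flat file, no Lakehouse tables are created, which matches the workaround described above.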
Hi @MOVC ,
Thanks for using Fabric Community.
A simple workaround is to load this into a cloud Lakehouse first, and then use another copy activity to write it to the Warehouse.
Meanwhile, we are also reaching out to the internal team for help on this.
We will update you once we hear back from them.
Hi - is there any update to this?
Copying from on-prem -> Lakehouse -> Warehouse isn't a practical solution for us. Is there an alternative? (Sorry, I am new to Fabric.)
Hi @MOVC ,
Glad to know that your query was resolved. Please continue using Fabric Community for your further queries.
I posted the same thing yesterday. Seems to go against the "OneLake" concept to have to bring additional Azure resources to copy to a warehouse.
Good question - I would like to know the answer too.