Hi,
I was wondering if anyone knew of a workaround to let existing resources in ADF or Azure Synapse write to OneLake.
I was hoping I could configure a service principal and then set up a linked service in ADF. However, the linked service enforces a naming convention on the URL, which the OneLake endpoint doesn't match.
For reference, I took the URL from: OneLake access and APIs - Microsoft Fabric | Microsoft Learn
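Since that doc describes OneLake as ADLS Gen2 compatible, a service principal can at least write to it directly with the Storage SDK outside of ADF. A minimal sketch, assuming placeholder workspace/lakehouse names and an app registration that has been granted access to the workspace:

```python
# Sketch: write a file to OneLake through its ADLS Gen2-compatible DFS endpoint.
# Workspace, lakehouse, tenant, and app registration values are placeholders.
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<service-principal-client-id>",
    client_secret="<service-principal-secret>",
)

# OneLake uses one fixed account URL; the workspace acts as the "file system"
# and the lakehouse item is the first path segment.
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=credential,
)

file_system = service.get_file_system_client("MyWorkspace")
file_client = file_system.get_file_client("MyLakehouse.Lakehouse/Files/landing/sample.csv")

with open("sample.csv", "rb") as data:
    file_client.upload_data(data, overwrite=True)
```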
Thanks
Ben
@bcdobbs It depends on what you want to do with the data. If your source data is already structured/curated, you generally don't need to copy it again into the lakehouse/warehouse; shortcuts will work.
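If you want to script the shortcut rather than create it in the Lakehouse UI, a rough sketch against the Fabric REST shortcuts endpoint could look like the following. The payload shape, scope, and IDs here are assumptions to verify against the Fabric REST API documentation:

```python
# Sketch: create a OneLake shortcut to existing ADLS Gen2 data via the Fabric REST API.
# Endpoint and payload shape are illustrative; workspace/item/connection IDs are placeholders.
import requests
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<client-id>",
    client_secret="<client-secret>",
)
token = credential.get_token("https://api.fabric.microsoft.com/.default").token

workspace_id = "<workspace-guid>"
lakehouse_id = "<lakehouse-item-guid>"

payload = {
    "name": "ExistingAdlsData",
    "path": "Files",
    "target": {
        "adlsGen2": {
            "location": "https://<account>.dfs.core.windows.net",
            "subpath": "/<container>/<folder>",
            "connectionId": "<connection-guid>",
        }
    },
}

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items/{lakehouse_id}/shortcuts",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
```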
@GeethaT-MSFT Will the connectors for the Fabric copy data activity be expanded to match what is available in Azure Data Factory? For instance, I have on-premises Oracle data that I would like to land in OneLake.
Thank you.
@kdoherty Yes, on-prem connectivity is tracked for GA. Until then, a workaround is to use an on-premises gateway in Power BI to stage the data in a cloud location and then use the copy activity.
That's good news. Thank you for replying.
I fully intend to once it supports what I need! Currently, the Fabric copy data activity in pipelines only has very basic API support; I need to be able to pass a bearer token in the auth header (ideally retrieved from Key Vault or the new Fabric equivalent), or, for internal Microsoft services, use a managed identity/service principal.
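For context, the pattern I'm after looks roughly like this sketch, which pulls a token from Key Vault and passes it as a bearer header. The vault URL, secret name, and API endpoint are placeholders, and this runs as code rather than through the copy activity:

```python
# Sketch: fetch an API bearer token from Azure Key Vault and call a REST source.
# Vault URL, secret name, and API endpoint are placeholders.
import requests
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential picks up a managed identity or service principal automatically.
credential = DefaultAzureCredential()
secrets = SecretClient(vault_url="https://<my-vault>.vault.azure.net", credential=credential)
bearer = secrets.get_secret("source-api-bearer-token").value

resp = requests.get(
    "https://<source-api>/v1/data",
    headers={"Authorization": f"Bearer {bearer}"},
)
resp.raise_for_status()
payload = resp.content  # land this in OneLake, e.g. via the DFS endpoint sketch above
```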
I was just thinking that until that's available (I understand it's coming) I could use ADF in Azure to land the data while I experiment with Fabric.
That said, having now played with it more, I think the way to go is simply to create a shortcut to my existing ADLS storage and get the data in that way.