
bcdobbs
Super User

Linked Service in ADF to One Lake

Hi,

 

I was wondering if anyone knew of a workaround to let existing resources in ADF or Azure Synapse write to OneLake.

 

I was hoping I could configure a service principal and then set up a linked service in ADF. However, it enforces a naming convention on the URL:

(attached screenshot: bcdobbs_0-1685345429306.png)

 

For reference I took the URL from: OneLake access and APIs - Microsoft Fabric | Microsoft Learn 
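In case it helps anyone trying the same thing: OneLake exposes an ADLS Gen2-compatible endpoint at `onelake.dfs.fabric.microsoft.com`, so one approach worth trying is an ADLS Gen2 (`AzureBlobFS`) linked service pointing at that endpoint with service principal auth. This is a sketch only; the name and credential values are placeholders, and the ADF portal's URL validation may still reject it (authoring the JSON directly, rather than through the UI wizard, sometimes avoids that check):

```json
{
  "name": "OneLakeLinkedService",
  "properties": {
    "type": "AzureBlobFS",
    "typeProperties": {
      "url": "https://onelake.dfs.fabric.microsoft.com",
      "servicePrincipalId": "<app-id>",
      "servicePrincipalCredentialType": "ServicePrincipalKey",
      "servicePrincipalCredential": {
        "type": "SecureString",
        "value": "<client-secret>"
      },
      "tenant": "<tenant-id>"
    }
  }
}
```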

Thanks

Ben



Ben Dobbs

LinkedIn | Twitter | Blog

Did I answer your question? Mark my post as a solution! This will help others on the forum!
Appreciate your Kudos!!
6 Replies
GeethaT-MSFT
Community Support

@bcdobbs It depends on what you want to do with the data. If your source data is already structured/curated, you generally don't need to copy it into the lakehouse/warehouse again; shortcuts will work.

GeethaT-MSFT
Community Support

Hi @bcdobbs, any reason not to use Fabric Pipelines to load into OneLake?

@GeethaT-MSFT Will the connectors for the Fabric copy data activity be expanded to match what is available in Azure Data Factory? For instance, I have on-premises Oracle data that I would like to land in OneLake.

 

Thank you.  

@kdoherty Yes, on-prem connectivity is tracked for GA. Until then, a workaround is to use on-premises gateways in Power BI to stage the data in a cloud location and then use the copy activity.

 

That's good news.  Thank you for replying.  

I fully intend to once it supports what I need! Currently the Fabric copy data activity in pipelines has only very basic API support; I need to be able to pass a bearer token in the auth header (ideally retrieved from Key Vault or the new Fabric equivalent), or, for internal Microsoft services, use a managed identity/service principal.
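For the interim ADF/script route, calling OneLake's ADLS Gen2-style endpoint with a service principal token looks roughly like the sketch below. The helper just builds the request headers; the workspace/lakehouse path and the `azure-identity` snippet in the comments are illustrative assumptions, not a tested recipe:

```python
# Sketch: bearer-token auth header for an ADLS Gen2-style call against OneLake.
# All names/paths in the comments are placeholders (assumptions).

def onelake_request_headers(token: str) -> dict:
    """Build request headers for a OneLake DFS REST call."""
    return {
        "Authorization": f"Bearer {token}",
        "x-ms-version": "2021-06-08",  # a recent ADLS Gen2 REST API version
    }

# In practice the token would come from azure-identity, e.g.:
#   from azure.identity import ClientSecretCredential
#   cred = ClientSecretCredential(tenant_id, client_id, client_secret)
#   token = cred.get_token("https://storage.azure.com/.default").token
# and the request would target something like:
#   https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>.Lakehouse/Files/<path>

headers = onelake_request_headers("<access-token>")
```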

 

I was just thinking that, until that is available (I understand it's coming), I could use ADF in Azure to land the data while I experiment with Fabric.

 

That said, having now played with it more, I think the way to go is simply to create a shortcut to my existing ADLS storage and get the data in that way.
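Shortcuts can also be created programmatically via the Fabric REST API (a POST to the item's shortcuts endpoint), which is handy if you have many ADLS paths to wire up. A minimal sketch that only builds the request payload; the payload shape, IDs, and URLs below are assumptions based on the public docs, so verify against the current API reference before use:

```python
# Sketch: request body for creating an ADLS Gen2 shortcut in a lakehouse via
# POST /v1/workspaces/{workspaceId}/items/{itemId}/shortcuts (Fabric REST API).
# All IDs and URLs are placeholders (assumptions).

def adls_shortcut_payload(name: str, storage_url: str, subpath: str,
                          connection_id: str) -> dict:
    """Build the JSON body for an ADLS Gen2 shortcut creation request."""
    return {
        "path": "Files",  # where the shortcut appears inside the lakehouse
        "name": name,
        "target": {
            "adlsGen2": {
                "connectionId": connection_id,
                "location": storage_url,  # e.g. https://<acct>.dfs.core.windows.net
                "subpath": subpath,       # e.g. /container/folder
            }
        },
    }

payload = adls_shortcut_payload(
    "ExistingAdls",
    "https://myaccount.dfs.core.windows.net",
    "/raw/sales",
    "00000000-0000-0000-0000-000000000000",
)
```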



