bcdobbs
Community Champion

Linked Service in ADF to OneLake

Hi,

 

I was wondering if anyone knew of a workaround to let existing resources in ADF or Azure Synapse write to OneLake.

 

I was hoping I could configure a service principal and then set up a linked service in ADF. However, it enforces a naming convention for the URL:

[Screenshot attachment: bcdobbs_0-1685345429306.png]

 

For reference, I took the URL from: OneLake access and APIs - Microsoft Fabric | Microsoft Learn
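For anyone trying the same approach: OneLake exposes an ADLS Gen2-compatible endpoint, so a linked service could in principle be authored directly as JSON (bypassing the UI's URL validation) using the standard ADLS Gen2 (`AzureBlobFS`) type. A minimal sketch with placeholder values; whether ADF accepts the non-standard DFS suffix is exactly the open question in this thread:

```json
{
  "name": "OneLakeLinkedService",
  "properties": {
    "type": "AzureBlobFS",
    "typeProperties": {
      "url": "https://onelake.dfs.fabric.microsoft.com",
      "servicePrincipalId": "<application-id>",
      "servicePrincipalKey": {
        "type": "SecureString",
        "value": "<service-principal-secret>"
      },
      "tenant": "<tenant-id>"
    }
  }
}
```

The service principal would also need to be granted access to the target Fabric workspace for writes to succeed.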

Thanks

Ben



Ben Dobbs

LinkedIn | Twitter | Blog

Did I answer your question? Mark my post as a solution! This will help others on the forum!
Appreciate your Kudos!!
6 REPLIES
GeethaT-MSFT
Microsoft Employee

@bcdobbs It depends on what you want to do with the data. If your source data is already structured/curated, you generally don't need to copy it again into the lakehouse/warehouse; shortcuts will work.

GeethaT-MSFT
Microsoft Employee

Hi @bcdobbs, any reason not to use Fabric Pipelines to load into OneLake?

@GeethaT-MSFT Will the connectors for the Fabric copy data activity be expanded to match what is available in Azure Data Factory? For instance, I have on-premises Oracle data that I would like to land in OneLake.

 

Thank you.  

@kdoherty Yes, on-prem connectivity is tracked for GA. Until then, a workaround is to use on-premises gateways in Power BI to stage the data in a cloud location and then use the copy activity.

 

That's good news.  Thank you for replying.  

I fully intend to once it supports what I need! Currently the Fabric copy data activity in pipelines has only very basic API support; I need to be able to pass a bearer key in the auth header (ideally retrieved from Key Vault or the new Fabric equivalent), or, for internal Microsoft services, use a managed identity/service principal.
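As a sketch of the pattern Ben describes, the bearer token would be fetched from Azure Key Vault (e.g. via the `azure-keyvault-secrets` package) and attached to the request's Authorization header. The endpoint URL below is hypothetical, and only the standard library is used for the call itself:

```python
import urllib.request


def build_auth_header(token: str) -> dict:
    # In practice the token would come from Azure Key Vault
    # (azure-keyvault-secrets) rather than being hard-coded.
    return {"Authorization": f"Bearer {token}"}


def fetch_with_bearer(url: str, token: str) -> bytes:
    # Attach the bearer header to a plain stdlib request.
    req = urllib.request.Request(url, headers=build_auth_header(token))
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

This is the piece the current Fabric copy activity can't express; in ADF the same header can be set on a REST or HTTP linked service.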

 

I was just thinking that until that was available (I understand it's coming) I could use ADF in Azure to land the data while I experiment with Fabric.

 

That said, having now played with it more, I think the way to go is simply to create a shortcut to my existing ADLS storage and get the data in that way.



