We need to connect to OneLake from the standalone ADF service (adf.azure.com) or from Synapse pipelines. Once connectivity is established, we would simply like to use the Copy Activity to write files to OneLake.
Note: We cannot use the ADF pipelines in Fabric itself because our source data is on-premises, and ADF pipelines in Fabric do not support SHIR / Data Gateway. Please see Differences between Data Factory in Fabric and Azure - Microsoft Fabric | Microsoft Learn.
We are running into two issues:
My thoughts on applicability to OneLake:
Would anyone have a proposed solution for these two items? Thanks!
Azure Data Factory and Synapse pipelines do not support OneLake connections yet.
What scenario are you trying to achieve with a OneLake connection? Could you use a OneLake shortcut or a Fabric pipeline to achieve it?
"Azure Data Factory or Synapse pipelines does not support OneLake connection yet" Can you please provide an official Microsoft documentation link for this statement?
Certainly one can connect from either platform, for example by integrating via PowerShell per the instructions at Manage OneLake with PowerShell - Microsoft Fabric | Microsoft Learn, or by routing via Spark notebooks or serverless SQL per the instructions at Integrate OneLake with Azure Synapse Analytics - Microsoft Fabric | Microsoft Learn. We are looking for a more direct way, though.
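For reference, the "direct" route we were hoping for would look roughly like the sketch below: the ADLS Gen2 Python SDK pointed at the documented OneLake DFS endpoint (`onelake.dfs.fabric.microsoft.com`). The workspace and lakehouse names are placeholders, and this assumes the SDK-compatibility claim in the OneLake docs actually holds for your tenant:

```python
ONELAKE_DFS = "https://onelake.dfs.fabric.microsoft.com"

def onelake_path(item: str, relative: str) -> str:
    """Build the path inside the workspace 'file system': item name first,
    then the Files/ or Tables/ section, e.g. 'MyLakehouse.Lakehouse/Files/a.csv'."""
    return f"{item}/{relative}"

def upload_to_onelake(workspace: str, item: str, relative: str, data: bytes) -> None:
    """Write bytes to OneLake through the ADLS Gen2 SDK surface."""
    # Imported here so the path helper above stays usable without the SDK installed.
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url=ONELAKE_DFS,                # the "account" is always onelake
        credential=DefaultAzureCredential(),
    )
    fs = service.get_file_system_client(workspace)  # workspace plays the container role
    file_client = fs.get_file_client(onelake_path(item, relative))
    file_client.upload_data(data, overwrite=True)

# Hypothetical usage (all names are placeholders):
# upload_to_onelake("MyWorkspace", "MyLakehouse.Lakehouse",
#                   "Files/landing/sample.csv", b"id,value\n1,42\n")
```

That works from arbitrary Python clients; the gap is that the ADF/Synapse Copy Activity has no connector exposing the same endpoint.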
If support is in fact lacking, then what's the roadmap on plugging the gap? OneLake is advertised everywhere as compatible with everything that uses ADLS Gen2 APIs. If that's not technically accurate, it's important for customers to know.
"What scenario do you wanna achieve with OneLake connection? Whether you can use OneLake short-cut or Fabric pipeline to achieve?" That's described in detail in the original post, especially the 2nd paragraph. Is there an additional clarification you are seeing? If so, can you please be more specific around the extra info you require?
If it were supported, it would be listed in the public doc Connector overview - Azure Data Factory & Azure Synapse | Microsoft Learn.
This is on the roadmap. It would be great if you could submit an idea, which will help with prioritization.
Thank you!
@GraceGu Is there a timeline on that item that you could please share?
If support is lacking we need to jump through additional architectural hoops such as provisioning extra ADLS Gen 2 accounts. That somewhat defeats the purpose of Fabric & OneLake as the consolidated analytics platform.
I've added the idea to https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=95965f46-0815-ee11-a81c-000d3a047196 per your suggestion. IMO it's a really fundamental feature.
Consider What is OneLake? - Microsoft Fabric | Microsoft Learn:
"OneLake supports the same ADLS Gen2 APIs and SDKs to be compatible with existing ADLS Gen2 applications..."
Or OneLake access and APIs - Microsoft Fabric | Microsoft Learn:
"You can access your data in OneLake through any tool compatible with ADLS Gen2 just by using a OneLake URI instead."
Many other OneLake documents make similar claims, which are technically incorrect according to your note above. It's rather strange for OneLake support to be missing in the #1 deployed ETL/ELT platform in Azure.
I think the documentation needs to be amended in this area. It's currently vastly overselling OneLake as a service that's fully backward-compatible with ADLS Gen2. The reality is rather different.
Thanks for the feedback. We will prioritize this; the tentative ETA is Q3 this year.
Thanks for the tentative timeline.
Can documentation please be fixed in the meantime? I provided multiple examples of erroneous documentation statements above. It's very challenging to design & build Fabric solutions in the absence of solid docs.