hsn367
Helper I

How to read data from Fabric lakehouse in AzureML

I am currently trying to load data from a Fabric Lakehouse in a notebook in my AzureML workspace, but I can't figure out how to do this. As per AzureML SDK v2, we can create a OneLake datastore in AzureML.

 

But is there any other way to read data from a Fabric lakehouse in AzureML? For instance, we can read data from Snowflake without creating a datastore; is there a similar way to read Fabric data in AzureML?
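For context, the kind of datastore-free access I have in mind would look roughly like this. This is my own sketch, not working code: the workspace, lakehouse, and file names are placeholders, and the commented-out read assumes adlfs and azure-identity are installed and that the identity has access to the Fabric workspace.

```python
def onelake_uri(workspace: str, lakehouse: str, relative_path: str) -> str:
    """abfss:// URI for a file under <lakehouse>.Lakehouse/Files in OneLake."""
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Files/{relative_path}"
    )

uri = onelake_uri("MyWorkspace", "MyLakehouse", "data/sales.parquet")
print(uri)

# With adlfs and azure-identity installed, an fsspec-aware reader could then
# open the URI directly (untested sketch, needs access to the workspace):
# import pandas as pd
# from azure.identity import DefaultAzureCredential
# df = pd.read_parquet(uri, storage_options={
#     "account_name": "onelake",
#     "credential": DefaultAzureCredential(),
# })
```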

4 REPLIES
CraigRG
New Member

I have also been trying to do this via the ML Studio UI. I'm new to the ML Studio and Fabric services, so please pardon my naivety and provide better information if you find errors below - thanks!

 

Today, I found a "Connections" option in the left-hand Manage menu and was surprised I could create a connection to OneLake. I provided my OneLake Lakehouse DFS URL, which I had previously used in Storage Explorer to connect to my lakehouse, and selected the appropriate authentication (e.g. Entra ID).

 

After some time, a Datasource appeared (in the Datasource tab of the Data UI) that references the connection. When I clicked on this Datasource, I could see what was in my Lakehouse; I could even browse it. When I hovered over a folder that contained a delta table, I could select the ellipsis (...) and choose the option to create a data asset!

 

This fulfilled my need for "manual" interfacing, and I'm sure someone more knowledgeable than me could expand on how to use this for automating integrations at enterprise scale as well.
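On the automation side, I imagine consuming the data asset the UI created would look something like this. This is only my untested sketch: the asset name, version, and workspace identifiers below are placeholders, and the MLClient part is commented out because it needs workspace credentials.

```python
def asset_ref(name: str, version: str) -> str:
    """Short-form reference for a registered AzureML data asset."""
    return f"azureml:{name}:{version}"

print(asset_ref("my_lakehouse_table", "1"))

# With workspace credentials, the asset itself can be fetched, e.g.:
# from azure.ai.ml import MLClient
# from azure.identity import DefaultAzureCredential
# ml_client = MLClient(DefaultAzureCredential(), "<sub>", "<rg>", "<ws>")
# asset = ml_client.data.get(name="my_lakehouse_table", version="1")
# print(asset.path)  # the underlying storage path of the asset
```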

hsn367
Helper I

Hi @Anonymous Thanks for the reply.

 

The script that you have provided is from the AzureML docs here; I had actually tried it before, but the problem is that it does not work.

 

Upon importing the necessary packages, I get the error:

 

CODE

from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
from azure.ai.ml.entities import OneLakeDatastore, OneLakeArtifact

 

ERROR

 

[screenshot: datastore_issue.png]

 

VERSION:

I am using version 1.17.0 of azure-ai-ml (the latest version at the time of writing).

 

[screenshot: datastore_issue_library.png]

 

That's why I asked in my question whether there is any other way of loading the Fabric data in AzureML other than by creating datastores.
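For anyone hitting the same import error, here is a quick diagnostic I would run before the datastore script: it checks whether the installed azure-ai-ml build actually exports the OneLake classes. This is my own sketch; it simply returns False if the package is missing or the classes are not exposed.

```python
import importlib


def has_onelake_classes() -> bool:
    """True if azure.ai.ml.entities exports the OneLake datastore classes."""
    try:
        entities = importlib.import_module("azure.ai.ml.entities")
    except ImportError:
        return False
    return all(
        hasattr(entities, name) for name in ("OneLakeDatastore", "OneLakeArtifact")
    )


print(has_onelake_classes())
```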

Anonymous
Not applicable

Hi @hsn367 ,

Yes, there are other ways to load structured data in AzureML besides creating a datastore:

  1. Reference data directly with a URI that points to a storage location on the local computer, in Azure storage, or at a publicly available HTTP(S) location.
  2. Import data directly from your local computer or an existing cloud-based storage resource without creating a datastore.
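To illustrate option 1, the URI shapes differ roughly as below. This is only an illustrative sketch (the scheme check is not the SDK's own logic, and the container, account, and path values are placeholders):

```python
from urllib.parse import urlparse


def data_uri_kind(uri: str) -> str:
    """Rough classification of a data reference, mirroring option 1 above."""
    scheme = urlparse(uri).scheme
    if scheme in ("http", "https"):
        return "public HTTP(S) location"
    if scheme in ("wasbs", "abfss", "azureml"):
        return "Azure storage / workspace reference"
    return "local path"


for uri in (
    "https://example.com/data/sample.csv",
    "wasbs://<container>@<account>.blob.core.windows.net/sample.csv",
    "./data/sample.csv",
):
    print(uri, "->", data_uri_kind(uri))
```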

Are you currently loading data successfully, and if so, which method are you using?

Best Regards,

Ada Wang

If this post helps, then please consider Accepting it as the solution to help other members find it more quickly.

 

Anonymous
Not applicable

Hi @hsn367 ,

We can very quickly introduce Microsoft Fabric Lakehouse data to Azure Machine Learning Service through a short script. You can follow the steps below:

1. Select Files > Properties in the Fabric Lakehouse and copy the ABFS path.

[screenshot: vyifanwmsft_0-1720071005652.png]

[screenshot: vyifanwmsft_1-1720071175780.png]

2. Create a new notebook on your local machine. Execute the following code to import the Lakehouse data into Azure Machine Learning Service.

! pip install azure-ai-ml -U
! pip install mltable azureml-dataprep[pandas] -U
! pip install azureml-fsspec -U

from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
from azure.ai.ml.entities import OneLakeDatastore, OneLakeArtifact

subscription_id = "<your Azure subscription ID>"
resource_group = "<your AzureML workspace resource group>"
workspace = "<your AzureML workspace name>"

# Authenticate against the AzureML workspace.
ml_client = MLClient(
    DefaultAzureCredential(), subscription_id, resource_group, workspace
)

# The artifact identifies the lakehouse inside the Fabric workspace.
artifact = OneLakeArtifact(
    name="<Lakehouse ID>",
    type="lake_house",
)

# Register a credential-less OneLake datastore in the workspace.
store = OneLakeDatastore(
    name="onelake_lh_for_azureml",
    description="Credential-less OneLake datastore.",
    endpoint="msit-onelake.dfs.fabric.microsoft.com",
    artifact=artifact,
    one_lake_workspace_name="<OneLake workspace name>",
)

ml_client.create_or_update(store)
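Assuming the datastore above registers successfully, files in the lakehouse can then be addressed with a long-form azureml:// datastore URI, which azureml-fsspec (installed above) lets fsspec-aware readers such as pandas open. A sketch with placeholder IDs; the pandas read is commented out because it needs access to the workspace:

```python
def datastore_path(sub: str, rg: str, ws: str, store: str, rel: str) -> str:
    """Long-form azureml:// URI for a file behind a registered datastore."""
    return (
        f"azureml://subscriptions/{sub}/resourcegroups/{rg}/"
        f"workspaces/{ws}/datastores/{store}/paths/{rel}"
    )


uri = datastore_path(
    "<sub-id>", "<rg>", "<ws>", "onelake_lh_for_azureml", "Files/data.csv"
)
print(uri)

# With azureml-fsspec installed and access to the workspace (untested sketch):
# import pandas as pd
# df = pd.read_csv(uri)
```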

 For more information, you can see the blog below:
Using Microsoft Fabric’s Lakehouse Data and prompt flow in Azure Machine Learning Service to create ...

 

Best Regards,

Ada Wang

If this post helps, then please consider Accepting it as the solution to help other members find it more quickly.
