Is it possible to essentially mount a network drive for use in a Fabric notebook? I've got a python script that I'm running on Windows that uses Windows network shares to grab data for manipulation. This is trivial. I'm trying to move the script to a Fabric notebook, but I'm not sure how to access the files I need from the network drive from within the notebook. Is this possible?
Hi @diablo908 - Fabric notebooks run in a cloud-based environment, which has no direct access to your on-premises network drives or shared folders. Instead, you can copy the data from the network drive to an Azure File Share or Azure Blob Storage.
Then use the Fabric notebook to connect to the Azure storage account and access the data with libraries such as azure-storage-blob or azure-storage-file-share.
Alternatively, if you upload the files to the Fabric Lakehouse or OneLake storage, you can access them directly within the Fabric notebook.
For your reference:
Load data from Azure Blob Storage into Python
Develop for Azure Files with Python - Azure Storage | Microsoft Learn
Proud to be a Super User!
Hi @diablo908 ,
You can bring data from your network drive into the Lakehouse using Dataflow Gen2, and then load it into a DataFrame from the Lakehouse.
Alternatively, you can pull the data into a semantic model, publish it to a Premium capacity workspace, and use semantic link to read it from the notebook.
I hope this is helpful.
Thanks,
Sai Teja
Hi @diablo908 , Hope your issue is solved. If it is, please consider marking the answer 'Accept as solution', so others with similar issues may find it easily. If it isn't, please share the details.
Thank you.
Hi @diablo908 , Thank you for reaching out to the Microsoft Fabric Community Forum.
The answer provided by @rajendraongole1 is correct.
Please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If you have any more queries regarding this issue, please share the details.
Thank you.