KimTutein
Advocate II

Using notebookutils.fs to copy data from blob storage using private endpoints

Hi community

 

Has anyone tried using notebookutils.fs to copy data from an Azure Blob Storage (ADLS Gen2) account using private endpoints? I have made a notebook that backs up my lakehouse data to a container in blob storage and later moves it back into a lakehouse (a backup/restore setup using blob and immutable blob containers). I have no problem if I use a storage account that is open to all networks. However, if I disable public network access and set up private endpoints on the blob and dfs sub-resources, I am only able to copy data from the Fabric notebook to blob storage (see appendix 1), but not the other way around (from blob storage to the lakehouse) - see appendix 2. The error says authentication failed, but if I open the network up again there is no problem, so the issue is network related.

 

Does anyone have any input on why I cannot copy data from blob storage to the lakehouse when using private endpoints? Any input/help will be much appreciated.

 

Appendix 1 - From lakehouse to blob => DOES work with private endpoint

from notebookutils import fs

 

source_path_files = "abfss://ds_data_test@onelake.dfs.fabric.microsoft.com/InvestLakeRestore2.Lakehouse/Files/temp/ktt%20tester.txt"

destination_path = "abfss://temp@ktttest3.blob.core.windows.net/Files/"

 

fs.fastcp(src=source_path_files, dest=destination_path, recurse=True)

 

 

 

Appendix 2 - From blob to lakehouse => Does NOT work with private endpoint

from notebookutils import fs

 

source_path_files = "abfss://temp@ktttest3.blob.core.windows.net/Files/"

destination_path = "abfss://ds_data_test@onelake.dfs.fabric.microsoft.com/InvestLakeRestore2.Lakehouse/Files/temp/ktt%20tester.txt"

 

fs.fastcp(src=source_path_files, dest=destination_path, recurse=True)

 

The error:

Py4JJavaError: An error occurred while calling z:notebookutils.fs.fastcp.
: java.lang.Exception: azcopy failed, cmd: bash -c azcopy copy 'https://ktttest3.blob.core.windows.net/temp/Files/' 'https://onelake.blob.fabric.microsoft.com/ds_data_test/InvestLakeRestore2.Lakehouse/Files/temp/ktt tester.txt' --trusted-microsoft-suffixes="*.pbidedicated.windows.net;*.pbidedicated.windows-int.net;*.fabric.microsoft.com" --recursive --skip-version-check exit:1

stdout:
INFO: Scanning...
INFO: AZCOPY_OAUTH_TOKEN_INFO is set.
INFO: Autologin not specified.
INFO: Authenticating to destination using Azure AD
INFO: Authenticating to source using Azure AD
INFO: Any empty folders will not be processed, because source and/or destination doesn't have full folder support
Job 34f1d337-8993-574c-6fba-d528a6723871 has started
Log file is located at: /home/trusted-service-user/.azcopy/34f1d337-8993-574c-6fba-d528a6723871.log
INFO: Transfers could fail because AzCopy could not verify if the destination supports tiers.
INFO: Authentication failed, it is either not correct, or expired, or does not have the correct permission
PUT https://onelake.blob.fabric.microsoft.com/ds_data_test/InvestLakeRestore2.Lakehouse/Files/temp/ktt tester.txt/Files/ktt tester.txt
RESPONSE 403: 403 Forbidden
ERROR CODE: CannotVerifyCopySource
<?xml version="1.0" encoding="utf-8"?><Error><Code>CannotVerifyCopySource</Code><Message>This request is not authorized to perform this operation. RequestId:a3b3076e-501e-0039-65cf-db61fb000000 Time:2025-06-12T19:20:49.3537228Z</Message></Error>

1 ACCEPTED SOLUTION
g3kuser
Helper I

You can create a shortcut for the private storage account. In the storage account you will have to whitelist the Fabric workspace under the resource type section in the network/firewall settings and allow Azure service traffic. I am not sure if you can have it set to the root.

The other thing you can try is to mount the storage container through the notebookutils library and see if that goes through.
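
Something along these lines - a minimal sketch, not tested in this thread; the mount point name, the key-based auth via accountKey, and an attached default lakehouse (exposed locally at /lakehouse/default) are assumptions:

import shutil
from notebookutils import fs

# Mount the private container. extraConfigs also accepts "sasToken" instead of
# "accountKey"; the key below is a placeholder (e.g. fetch it from Key Vault).
fs.mount(
    "abfss://temp@ktttest3.dfs.core.windows.net",  # container to mount
    "/blob_backup",                                # arbitrary mount point name
    {"accountKey": "<storage-account-key>"},
)

# Resolve the mount point to a local filesystem path.
local_src = fs.getMountPath("/blob_backup")

# Copy from the mounted container into the default lakehouse Files section.
shutil.copytree(f"{local_src}/Files", "/lakehouse/default/Files/temp", dirs_exist_ok=True)

fs.unmount("/blob_backup")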


4 REPLIES

Hi @g3kuser 

 

The thing I got to work was to mount the blob storage and reference the files via this mount.

 

Thank you for your help/input

g3kuser
Helper I

Have you tried creating a shortcut for the ADLS Gen2 storage and performing the copy operation through the shortcut URL instead of the ABFSS Gen2 path?
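
For example, once the shortcut exists, both sides of the copy are OneLake paths, and OneLake resolves the shortcut to the storage account on the service side. A sketch, assuming a shortcut named blob_temp has been created under the lakehouse Files section (the shortcut name is hypothetical):

from notebookutils import fs

# "blob_temp" is a hypothetical shortcut under the lakehouse Files section
# that points at the temp container of the ktttest3 storage account.
shortcut_path = "abfss://ds_data_test@onelake.dfs.fabric.microsoft.com/InvestLakeRestore2.Lakehouse/Files/blob_temp/"
destination_path = "abfss://ds_data_test@onelake.dfs.fabric.microsoft.com/InvestLakeRestore2.Lakehouse/Files/temp/"

# Both paths stay inside OneLake; the shortcut is resolved server-side.
fs.fastcp(src=shortcut_path, dest=destination_path, recurse=True)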

Hi @g3kuser 

Thank you for your idea.

 

However, making a shortcut does not seem to work when the network is closed to public access and a private endpoint is used. One more thing is that you cannot make a shortcut at the root of the account - the root has to be a container (and our containers are created on the fly).

 
