Summary – Listing paths in a Fabric Lakehouse via the OneLake DFS endpoint (azure-storage-file-datalake) fails with ResourceNotFoundError / ArtifactNotFound, even though authentication succeeds and workspaces list correctly.
Code:
# Install the required packages first:
#   pip install azure-storage-file-datalake azure-identity
from azure.storage.filedatalake import DataLakeServiceClient
from azure.identity import DefaultAzureCredential

# Set your account, workspace, and item path here
ACCOUNT_NAME = "onelake"
WORKSPACE_NAME = "FABRIC MVP 1 "
DATA_PATH = "/LH_MVP_1.lakehouse/Files/Processed"

def main():
    # Create a service client using the default Azure credential
    account_url = f"https://{ACCOUNT_NAME}.dfs.fabric.microsoft.com"
    token_credential = DefaultAzureCredential()
    service_client = DataLakeServiceClient(account_url, credential=token_credential)

    # Create a file system client for the workspace
    file_system_client = service_client.get_file_system_client(WORKSPACE_NAME)

    # List a directory within the filesystem
    paths = file_system_client.get_paths(path=DATA_PATH)
    for path in paths:
        print(path.name + '\n')

if __name__ == "__main__":
    main()
Is anyone facing a similar issue, and does anyone have suggestions to solve this, please?
There is a post at this link - How to use service principal authentication to access Microsoft Fabric's OneLake (dataroots.io). Can you help me understand whether there are tenant-level settings that need to be enabled for service principals to interact properly with OneLake?
Hi @tajjallarukshan ,
The error message "azure.core.exceptions.ResourceNotFoundError: Artifact 'lakehouse' is not found in workspace 'MVP', ErrorCode: ArtifactNotFound" indicates that the Lakehouse artifact cannot be found in the specified workspace.
Please check the following: make sure the workspace name and Lakehouse name used in the code exactly match the names shown in the Fabric portal, with no typos or extra spaces.
If you have any other questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I misunderstand your needs or you still have problems on it, please feel free to let us know. Thanks a lot!
Hi Yang,
I updated the code posted here with different names to maintain the confidentiality of the organization.
I have validated the names for typos and extra spaces, but that's not the issue.
My approach has been to copy the workspace name from the portal and to copy the Lakehouse name exactly as it appears, so there is no chance of such a miss.
However, here is my detailed analysis of where the code runs into a problem:
So this shows:
1. Authentication is successful
2. All workspaces are being listed as file systems successfully
3. But the error appears while attempting to list paths inside a specific workspace
Do you know if the format in which the data path is passed is correct here "file_system_client.get_paths(path=DATA_PATH)"?
My use case is to upload files to the Lakehouse via Python. I followed the approach in this link, and two months ago I was able to successfully list, upload, and update files in the Lakehouse. But the same code abruptly stopped working just a week after its successful run. I have a ticket open with Microsoft for this, but it isn't heading in any direction.
Hi,
Have you found a solution to this problem? We are experiencing exactly the same thing: our Python scripts suddenly stopped working without any change on our side.
Hello,
I have found the cause of it, and it has to do with the nomenclature you follow for naming your lakehouse, workspace and all subsequent objects. They should not have upper case letters or special characters. That solved the problem for us!
Hi @tajjallarukshan, I'm having the same problem. Even after your comment, I tried creating another Lakehouse with lower-case letters and no special characters, and I am still unable to access or list the paths. I am using the client ID, secret, and tenant ID for authentication. I am able to print the file system name, as you do in your code, and that works fine. But once I try to upload any file or list paths, I get the same error. Can you please help me out here?
Hello,
My case involved capital letters and spaces in the names of both the Lakehouse and the workspace. When I converted everything to lower case without any spaces, it worked absolutely fine.
This must be a naming issue: check for spaces, capital letters, and special characters, and make sure you don't use them.
Try creating a new workspace and a new Lakehouse, add the service principal to the workspace with Contributor rights, and try again.
Hi,
What I have found out is that whitespace, '*', and '|' are evil: do not use them in the workspace or Lakehouse name. I couldn't find a clear rule; even some names that are compliant with ADLS Gen2's naming reference broke for me - naming-and-referencing-containers--blobs--and-metadata.