I created a SAS token for Fabric, but it gives an error when I try to use it.
I am unsure why the request is expecting a bearer token when I am using a SAS.
Here are the parameters used in the request:
Hi @SachinNandanwar, I see some code in your blog that needs to be changed. Could you please make the following changes to your code and try again?
DataLakeSasBuilder sasBuilder = new DataLakeSasBuilder()
{
    FileSystemName = workspaceName,
    Resource = "d",
    IsDirectory = true,
    Path = lakehouseName, // Should be {your lakehouse}.Lakehouse/Files
    StartsOn = _keyStartTime,
    ExpiresOn = _keyExpiryTime
};
Hi @Yao-MSFT ,
I am using the onelake.dfs endpoints. See the below screenshot of the code from my blog where I set the endpoint values to a variable.
Also, if you check my earlier Postman screenshots in this post, you can see that I am using the dfs endpoints.
My delegation key expiration is set to an hour from the time the SAS token is generated,
and the SAS token expiry is also set to an hour.
After making your suggested changes, it's still the same error.
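As a quick sanity check before digging further, it can help to confirm the token has not simply expired between generation and use. A minimal sketch (sas_expired is a hypothetical helper; it assumes the 'se' field uses the ISO-8601 UTC format standard for Azure SAS tokens):

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs

def sas_expired(sas_token: str) -> bool:
    """Return True if the SAS token's 'se' (signed expiry) field is in the past."""
    se = parse_qs(sas_token.lstrip("?"))["se"][0]
    expiry = datetime.strptime(se, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
    return datetime.now(timezone.utc) >= expiry

# An already-expired token:
print(sas_expired("sv=2021-06-08&se=2020-01-01T00:00:00Z&sp=rl&sig=abc"))  # True
```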
I was having this same issue for a few days, trying to figure out what the problem was. I finally got it to work in Python with the following code:
# Requires 'pip install azure-storage-file-datalake azure-identity'
from azure.storage.filedatalake import (
    DataLakeServiceClient,
    generate_directory_sas,
)
from azure.identity import DefaultAzureCredential
from datetime import datetime, timedelta
import pytz

def get_user_delegation_sas(workspace: str, data_path: str) -> str:
    """
    Generates a User Delegation SAS token for accessing a Data Lake in Azure.
    """
    token_credential = DefaultAzureCredential()
    service_client = DataLakeServiceClient(
        account_url="https://onelake.dfs.fabric.microsoft.com",
        credential=token_credential
    )

    # Get a user delegation key that's valid for 30 minutes
    delegation_key_start_time = (datetime.now(pytz.utc) - timedelta(minutes=5)).replace(second=0, microsecond=0)
    delegation_key_expiry_time = delegation_key_start_time + timedelta(minutes=30)
    user_delegation_key = service_client.get_user_delegation_key(
        key_start_time=delegation_key_start_time,
        key_expiry_time=delegation_key_expiry_time,
    )

    return generate_directory_sas(
        account_name="onelake",
        file_system_name=workspace,
        directory_name=data_path,
        credential=user_delegation_key,
        permission="racwdl",
        expiry=delegation_key_expiry_time,
        start=delegation_key_start_time,
    )

if __name__ == "__main__":
    ACCOUNT_URL = "https://onelake.dfs.fabric.microsoft.com"
    WORKSPACE_NAME = "<workspace name>"
    LAKEHOUSE_NAME = "<Lakehouse name>"

    root_path = f"{LAKEHOUSE_NAME}.Lakehouse"
    data_path = f"{root_path}/Files"

    # SAS token has permission on the lakehouse root path; could be more specific
    sas_token = get_user_delegation_sas(WORKSPACE_NAME, root_path)
    print("SAS token: ", sas_token)

    # Try to use the SAS token to list files
    service_client = DataLakeServiceClient(ACCOUNT_URL, credential=sas_token)
    file_system_client = service_client.get_file_system_client(WORKSPACE_NAME)
    paths = file_system_client.get_paths(path=data_path, recursive=True)
    for path in paths:
        print(path.name + "\n")
My main issue was that I used the full endpoint in the generation and signing of the SAS token, but the right answer is to use "onelake" only as the account name. I have also tried using lakehouse and workspace item IDs, but I only got it to work with names.
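For anyone testing in Postman, the generated SAS is just a query string appended to the directory URL. A minimal sketch of assembling such a URL (build_sas_url is a hypothetical helper, and the workspace/path values are placeholders):

```python
def build_sas_url(workspace: str, path: str, sas_token: str) -> str:
    """Append a SAS query string to a OneLake DFS directory URL."""
    base = "https://onelake.dfs.fabric.microsoft.com"
    return f"{base}/{workspace}/{path}?{sas_token.lstrip('?')}"

url = build_sas_url("MyWorkspace", "MyLakehouse.Lakehouse/Files", "sv=2021-06-08&sp=rl&sig=abc")
print(url)
# https://onelake.dfs.fabric.microsoft.com/MyWorkspace/MyLakehouse.Lakehouse/Files?sv=2021-06-08&sp=rl&sig=abc
```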
Hi @SachinNandanwar - I'm happy to help here. The reason you are receiving the error message regarding a bearer token is that all OneLake SAS are user-delegated, so the SAS is authenticated very similarly to how a bearer token would be, hence the similar error.
From reviewing the samples above, your issue might be with the signedPermissions ('sp') field - it looks like you're only granting the SAS Write and List permissions, which means the SAS won't have permission to Read, hence the authorization error. Could you try adjusting the signedPermissions and trying again?
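A quick way to check which permissions a generated token actually carries is to parse the 'sp' field out of its query string. A small sketch (sas_permissions is a hypothetical helper; the token string is an invented example):

```python
from urllib.parse import parse_qs

def sas_permissions(sas_token: str) -> str:
    """Return the 'sp' (signedPermissions) field of a SAS query string."""
    return parse_qs(sas_token.lstrip("?")).get("sp", [""])[0]

perms = sas_permissions("sv=2021-06-08&sp=wl&sig=abc")
print(perms)         # wl
print("r" in perms)  # False -> no Read permission, so reads will be denied
```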
Hello @Anonymous and @SachinNandanwar ,
I am currently stuck on the same error and was wondering whether you have found a solution for the same? Any help would be greatly appreciated. Thank you.
Nope, not yet.
It's been over a month since I blogged about the issue, but there hasn't been a solution in sight:
https://www.azureguru.net/sas-token-in-fabric#heading-the-issue
I even tried the Fabric Reddit forum, but to no avail:
https://www.reddit.com/r/MicrosoftFabric/comments/1g0m0sy/comment/lrf41kl/?context=3
I went through the following documentation. It says OneLake SAS can grant access to files and folders within data items like lakehouses, but it doesn't mention that it requires a combination of a SAS token and a bearer token.
Create a OneLake shared access signature (SAS) (Preview) - Microsoft Fabric | Microsoft Learn
Typically, a SAS token should be sufficient for accessing the resources, as it includes the necessary permissions and signature for authentication. But to troubleshoot this further, you might try including both a SAS token and a bearer token in your request and testing whether that makes the call work. You might test this first and let us know the result. Thank you in advance.
Best Regards,
Jing
Community Support Team
@Anonymous I tried using a bearer token, hoping that the bearer token would authenticate the request and that the information in the SAS token would authorize it.
But unfortunately this isn't the case. Have a look at the screengrab.
It's the bearer token that takes precedence in the request, with the SAS token being completely ignored.
I used the Az PowerShell cmdlets to create the bearer token:
Connect-AzAccount
$My_Token = Get-AzAccessToken -ResourceTypeName Storage
$My_Token.Token | Set-Clipboard
Thanks @Anonymous .
Do you know which endpoint I should use? For the Fabric APIs, the scope used for the bearer token is https://api.fabric.microsoft.com/.default
For DFS I tried using https://onelake.dfs.fabric.microsoft.com/.default, but this isn't working.
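One way to see which resource a bearer token was actually issued for is to decode its payload locally and inspect the 'aud' claim. A sketch assuming a standard three-part JWT (this only base64-decodes the claims; it does not verify the signature, and the demo token below is built locally rather than a real Azure token):

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    """Decode the payload (claims) segment of a JWT without verifying it."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

# Example with a locally built, unsigned token:
seg = lambda obj: base64.urlsafe_b64encode(json.dumps(obj).encode()).decode().rstrip("=")
demo = seg({"alg": "none"}) + "." + seg({"aud": "https://storage.azure.com"}) + "."
print(jwt_claims(demo)["aud"])  # https://storage.azure.com
```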