throwserror
New Member

Upload CSV from PowerShell to Lakehouse

I'm running into some trouble with permissions while trying to upload a CSV file to a lakehouse.

Install-Module -Name Az -AllowClobber -Scope CurrentUser

$TenantId = "xx"
$ClientId = "xx"
$ClientSecret = "xx"

$AuthUrl = "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token"
$Body = @{
client_id = $ClientId
scope = "https://storage.azure.com/.default" # Required for OneLake API
client_secret = $ClientSecret
grant_type = "client_credentials"
}

# Get Access Token
$TokenResponse = Invoke-RestMethod -Method Post -Uri $AuthUrl -Body $Body -ContentType "application/x-www-form-urlencoded"
$AccessToken = $TokenResponse.access_token
Write-Host "Access Token: $AccessToken"
$OneLakeUri = "https://onelake.dfs.fabric.microsoft.com/aaaaa/bbbbb/Files"

Invoke-WebRequest -Method GET -Uri $OneLakeUri -Headers @{
Authorization = "Bearer $AccessToken"
}

# Define variables
$WorkspaceId = "aaaa"  # The Fabric workspace ID
$LakehouseName = "lh_dev_dwh"  # The name of your lakehouse
$FolderPath = "Files"  # Target folder in OneLake to upload the file to
$FileName = "Daily_Report_XXX.csv"  # File to upload
$LocalFilePath = "C:\taod_export\Daily_Report_XXX.csv"  # Path to the file on your local machine

# OneLake API endpoint
#$OneLakeUri = "https://$WorkspaceId.lakehouse.fabric.microsoft.com/$LakehouseName/$FolderPath/$FileName"
$OneLakeUri = "https://onelake.dfs.fabric.microsoft.com/aaaa/bbbb/Files/Daily_Report_XXX.csv" 



# Read the file content as raw bytes
# (-Encoding Byte works in Windows PowerShell 5.1; PowerShell 7+ uses -AsByteStream instead)
$FileContent = Get-Content -Path $LocalFilePath -Raw -Encoding Byte

# Upload the file
Invoke-RestMethod -Method Put -Uri $OneLakeUri -Headers @{ 
    "Authorization" = "Bearer $AccessToken" 
    "Content-Type"  = "application/octet-stream"
} -Body $FileContent

 

This is what I am trying, and I hope this is the right way to do it. Please let me know if there is an easier way to do it via PowerShell.

My main problem right now: even though I assigned the Storage Blob Data Contributor and Storage Account Contributor roles on my subscription to my AD app, and the underlying Fabric resource is inheriting them, when I decode my access token at jwt.io I don't get any "roles" claim. With those permissions missing, my script won't work. I tried GPTing it and I am now running in circles. Does anyone have a better approach or an idea of what to test next?

Best 

throwserror 

2 ACCEPTED SOLUTIONS
anilgavhane
Resolver III

@throwserror  

What’s Going Wrong

You're using the client credentials flow to get an access token for OneLake, but you're not seeing any roles in the token payload. That's expected behavior for this flow unless you're using role-based access control (RBAC) via Microsoft Entra ID (formerly Azure AD) and have explicitly assigned app roles or API permissions to your app registration.

The key issue: Storage Blob Data Contributor and Storage Account Contributor are Azure RBAC roles, but OneLake in Microsoft Fabric doesn't use traditional Azure Storage RBAC directly. Instead, it uses Microsoft Fabric-specific permissions and Entra ID scopes.

 

Recommended Fixes and Alternatives

1. Use Connect-AzAccount Instead of Client Credentials

If you're just uploading files manually via PowerShell, the simplest route is to use interactive login:

 

Install-Module Az.Storage -Force
Connect-AzAccount

# "onelake" is the account name and fabric.microsoft.com the endpoint for OneLake.
$ctx = New-AzStorageContext -StorageAccountName "onelake" -UseConnectedAccount -Endpoint "fabric.microsoft.com"

# -FileSystem is your workspace name; the path starts with "<lakehouse name>.lakehouse".
Set-AzDataLakeGen2ItemContent -Context $ctx `
    -FileSystem "<workspace name>" `
    -Path "<lakehouse name>.lakehouse/Files/Daily_Report_XXX.csv" `
    -Source "C:\taod_export\Daily_Report_XXX.csv"



 

This avoids the whole token dance and uses your signed-in identity, which already has access to the workspace.

Reference: Microsoft’s PowerShell guide for OneLake

 

2. If You Must Use Client Credentials

Then you need to:

  • Register your app in Microsoft Entra ID
  • Assign API permissions like Storage delegated access or Files.ReadWrite.All
  • Use delegated permissions if possible (requires user context)
  • Or configure app roles and assign them to your app

But this is complex and not well-supported for OneLake yet. The REST API is still evolving, and many operations expect user context, not app-only tokens.

 

3. Use AzCopy with Entra ID Auth

If you're automating uploads, consider using AzCopy with Entra ID. It supports OneLake and handles token acquisition and permissions more gracefully.
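
A rough sketch of what that could look like from PowerShell (the OneLake Blob endpoint format and the --trusted-microsoft-suffixes flag are assumptions based on the public AzCopy-to-OneLake guidance, and the workspace name is a placeholder, so verify against the current docs):

# Interactive Entra ID sign-in for AzCopy.
azcopy login

# Copy the local CSV into the lakehouse Files folder via the OneLake Blob endpoint.
azcopy copy "C:\taod_export\Daily_Report_XXX.csv" `
    "https://onelake.blob.fabric.microsoft.com/<workspace name>/lh_dev_dwh.lakehouse/Files/" `
    --trusted-microsoft-suffixes "onelake.blob.fabric.microsoft.com;onelake.dfs.fabric.microsoft.com"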

 

What to Test Next

  • Try switching to Connect-AzAccount and New-AzStorageContext for a quick win
  • If you must use client credentials, inspect the token’s scp (scope) claim as well as roles (see the decoding sketch after this list)
  • Check whether the Fabric workspace has granted access to your app via Microsoft Fabric Admin Portal
  • Use the Purple Frog PowerShell script as a reference—it’s tailored for OneLake uploads
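
For the claim-inspection point above, here is a minimal sketch that decodes the token payload locally instead of pasting it into jwt.io (it reuses the $AccessToken variable from the question's script):

# Decode the JWT payload (the second dot-separated segment) and inspect its claims.
$payloadB64 = $AccessToken.Split('.')[1].Replace('-', '+').Replace('_', '/')
# Pad the Base64Url string to a multiple of 4 before decoding.
switch ($payloadB64.Length % 4) { 2 { $payloadB64 += '==' } 3 { $payloadB64 += '=' } }
$claims = [System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payloadB64)) | ConvertFrom-Json
# aud = audience (resource), scp = delegated scopes, roles = app roles (client credentials flow).
$claims | Select-Object aud, scp, roles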

 

 


v-veshwara-msft
Community Support

Hi @throwserror ,

Thanks for posting in Microsoft Fabric Community.

As @anilgavhane  already mentioned, the behavior you see with the token not containing roles is expected when using client credentials. What matters for OneLake is not Azure Storage RBAC but permissions within the Fabric workspace. The service principal must be added as a Member, Contributor, or Admin in the Fabric workspace itself, since Fabric manages its own access model and does not honor Azure RBAC roles.
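
If you would rather script that role assignment than use the workspace "Manage access" UI, a rough sketch against the Fabric REST API role-assignments endpoint is below. The endpoint path and body fields are based on the public Fabric REST API reference, and $AdminToken (a token for https://api.fabric.microsoft.com held by a workspace Admin), $WorkspaceId, and $SpObjectId (the service principal's object ID) are placeholders, so check the current docs before relying on it:

# Add the service principal as a Contributor on the workspace.
$Body = @{
    principal = @{ id = $SpObjectId; type = "ServicePrincipal" }
    role      = "Contributor"
} | ConvertTo-Json

Invoke-RestMethod -Method Post `
    -Uri "https://api.fabric.microsoft.com/v1/workspaces/$WorkspaceId/roleAssignments" `
    -Headers @{ Authorization = "Bearer $AdminToken" } `
    -ContentType "application/json" `
    -Body $Body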

 

Also, when requesting the token, please make sure the scope is set to https://onelake.dfs.fabric.microsoft.com/.default instead of https://storage.azure.com/.default.

Using the storage endpoint scope is valid for Azure Storage accounts, but OneLake requires its own scope.
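
Concretely, the token request from the question could be adjusted like this (a sketch reusing the $TenantId, $ClientId, and $ClientSecret variables defined there):

$Body = @{
    client_id     = $ClientId
    client_secret = $ClientSecret
    grant_type    = "client_credentials"
    scope         = "https://onelake.dfs.fabric.microsoft.com/.default"  # OneLake scope instead of storage.azure.com
}
$TokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token" `
    -Body $Body -ContentType "application/x-www-form-urlencoded"
$AccessToken = $TokenResponse.access_token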

 

For more details on SPNs: Service Principals in Fabric Data Warehouse - Microsoft Fabric | Microsoft Learn

 

In addition, for service principal access to work, the following tenant settings need to be enabled by a Fabric admin:

  1. Service principals can call Fabric public APIs

  2. Users can access data stored in OneLake with apps external to Fabric

Hope this helps. Please reach out for further assistance.

Thank you.

 

Also thanks to @anilgavhane for detailed explanation and for sharing useful resources.


6 REPLIES
v-veshwara-msft
Community Support

Hi @throwserror ,

May I ask if the solutions provided have addressed your needs?

If you need any further assistance, feel free to reach out.

Thank you.

Also thanks @spaceman127 for sharing your valuable suggestions.

spaceman127
Resolver II

Hi @throwserror,

 

You can also use a fabric notebook.

I always use notebooks for this because I find them easier to handle. It's another approach, and you might like it.

You can find an example in my GitHub repository. I use it to retrieve cost data, but parts of it can be adapted for other purposes. An Azure Key Vault can also be integrated.

 

The repository:
https://github.com/renefuerstenberg/Microsoft_Fabric/tree/main/fabric_nb_code_get_azure_cost_data_ex...

 

Another option is to use a Data Pipeline with a Copy Activity when a notebook is more than you need. It's easy to create.

Steps:

1. Connect your local environment with a Data Gateway

2. Create a Data Pipeline and use a Copy Activity

 

If you need any further help, please let me know.

 

Best regards

v-veshwara-msft
Community Support

Hi @throwserror ,
We wanted to kindly follow up regarding your query. If you need any further assistance, please reach out.
Thank you.

