spartan27244
Resolver I

Uploading Files using .Net in C#

I am attempting to upload a file using the Azure.Storage.Files.DataLake library from NuGet. I am getting an error that I cannot make sense of. Here is what I have...

DefaultAzureCredential credential = new();
DataLakeServiceClient dlServClient = new(new Uri(sLakeURL), credential);
DataLakeFileSystemClient dlFileSysClient = dlServClient.GetFileSystemClient("my-workspace");
DataLakeDirectoryClient dlToFolder = dlFileSysClient.GetDirectoryClient("my-datalake");

DataLakeFileClient file = dlToFolder.GetFileClient("File/Subfolder1/SubFolder2/MyFile.txt");
file.Upload("MyFile.txt", overwrite: true);

I get this error: "Request Failed with Required item type extension is missing in the item name. Expected format {itemname}.{itemtype}". But obviously the filename has an extension on it.
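One way to narrow down whether the failure is really about the path (a sketch only, reusing the dlFileSysClient and dlToFolder objects from the snippet above and assuming it runs inside an async method) is to probe each level of the path with ExistsAsync:

// Sketch: check each level of the target path to see where resolution stops.
// Assumes the dlFileSysClient / dlToFolder clients created in the snippet above.
Console.WriteLine($"File system exists: {(await dlFileSysClient.ExistsAsync()).Value}");
Console.WriteLine($"my-datalake exists: {(await dlToFolder.ExistsAsync()).Value}");

// Walk the nested folder names used in the upload path one segment at a time.
string[] segments = { "File", "Subfolder1", "SubFolder2" };
DataLakeDirectoryClient current = dlToFolder;
foreach (string segment in segments)
{
    current = current.GetSubDirectoryClient(segment);
    Console.WriteLine($"{segment} exists: {(await current.ExistsAsync()).Value}");
}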

 


7 REPLIES
v-prasare
Community Support

We are following up once again regarding your query. Could you please confirm whether you have shared your ideas with Microsoft?

If you have made any progress, we kindly request that you share the resolution or key insights here to help others in the community. If we don't hear back, we'll go ahead and close this thread.

Should you need further assistance in the future, we encourage you to reach out via the Microsoft Fabric Community Forum and create a new thread. We’ll be happy to help.

 

Thank you for your understanding and participation.

I have not done any follow-up with Microsoft (I really don't know how), but I have just tabled this project. At this point we are evaluating whether Fabric is even something we want to continue to pursue.

spartan27244
Resolver I

For now I have abandoned this project, since logging in using a user ID and a password without MFA is a royal pain.
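For anyone who hits the same wall: Azure.Identity does include a UsernamePasswordCredential that can sign in with a user ID and password when the account is not subject to MFA or conditional access. A minimal sketch (every value below is a placeholder, not something from this thread):

using System;
using Azure.Identity;
using Azure.Storage.Files.DataLake;

// Sketch only: UsernamePasswordCredential works solely for accounts where MFA
// is not enforced. Every value below is a placeholder.
var credential = new UsernamePasswordCredential(
    "service-account@yourtenant.com", // user name (placeholder)
    "<password>",                     // password (placeholder)
    "<tenant-id>",                    // Entra tenant ID (placeholder)
    "<client-id>");                   // app registration client ID (placeholder)

var serviceClient = new DataLakeServiceClient(new Uri("<lake-url>"), credential);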

Hi @spartan27244,

Thanks for the confirmation; we understand your pain in this scenario. I'd encourage you to submit your detailed feedback and ideas via Microsoft's official feedback channels, such as Microsoft Fabric Ideas. Feedback submitted there is often reviewed by the product teams and can lead to meaningful improvements.

Thanks,

Prashanth Are

MS Fabric community support

v-prasare
Community Support

Hi @spartan27244,

Although it seems to indicate a file extension issue, this message is often returned when intermediate directories in the file path do not exist and haven't been explicitly created before attempting the upload. Azure Data Lake Gen2 does not automatically create nested directories.

Above, dlToFolder is pointing to "my-datalake", and then you're asking it to upload a file under the nested path "File/Subfolder1/SubFolder2/MyFile.txt" without explicitly creating those folders (File, Subfolder1, SubFolder2).

Can you try making the changes below to your code and let me know if you're able to resolve your issue:

DefaultAzureCredential credential = new();
DataLakeServiceClient dlServClient = new(new Uri(sLakeURL), credential);

// Get the file system client
DataLakeFileSystemClient dlFileSysClient = dlServClient.GetFileSystemClient("my-workspace");

// Create/get the root directory (e.g., "my-datalake")
DataLakeDirectoryClient dlRootFolder = dlFileSysClient.GetDirectoryClient("my-datalake");
await dlRootFolder.CreateIfNotExistsAsync();

// Create the intermediate subdirectories
DataLakeDirectoryClient level1 = await dlRootFolder.CreateSubDirectoryAsync("File");
DataLakeDirectoryClient level2 = await level1.CreateSubDirectoryAsync("Subfolder1");
DataLakeDirectoryClient level3 = await level2.CreateSubDirectoryAsync("SubFolder2");

// Get the file client at the final path
DataLakeFileClient file = level3.GetFileClient("MyFile.txt");

// Upload the file
await file.UploadAsync("MyFile.txt", overwrite: true);
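A possible variation on the same idea (a sketch, assuming the same dlFileSysClient as above): the nested directory can also be requested as one relative path and created in a single call. On an ADLS Gen2 account with a hierarchical namespace this typically creates the intermediate folders as well, though it is worth verifying against your Lakehouse:

// Sketch: create the whole nested path in one call instead of level by level.
// GetDirectoryClient accepts a relative path with "/" separators.
DataLakeDirectoryClient nested =
    dlFileSysClient.GetDirectoryClient("my-datalake/File/Subfolder1/SubFolder2");
await nested.CreateIfNotExistsAsync();

DataLakeFileClient nestedFile = nested.GetFileClient("MyFile.txt");
await nestedFile.UploadAsync("MyFile.txt", overwrite: true);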

 

 

 

Thanks,

Prashanth Are

MS Fabric community support

The destination folders do already exist. However, I am not quite sure where the root begins. The Data Lake already creates Tables and Files folders. I am not sure whether, when I connect, I start with "Files/Subfolder1/Subfolder2/Filename.txt" or "Subfolder1/Subfolder2/Filename.txt". I'm sure I have tried both.

 

Could it also be the credential? I am working on a machine that is not part of the Entra domain where the lakehouse resides. I have a service ID and password that does not require MFA, but I cannot figure out how to use it.

Hi @spartan27244,

 

When you connect using a DataLakeServiceClient or DataLakeFileSystemClient, the path hierarchy starts at the container level (i.e., the Fabric Lakehouse's Files or Tables area). If you're working in the Files area, you must start your path with "Files/", for example:

"Files/Subfolder1/Subfolder2/MyFile.txt"

 

If you're working on a machine that's not Entra-joined and you have a service principal (client ID and secret) that doesn't require MFA, the best option is to use the ClientSecretCredential class. It authenticates explicitly using your app registration details in Azure.

using System;
using System.IO;
using Azure.Identity;
using Azure.Storage.Files.DataLake;

// Replace with your actual values
string tenantId = "<your-tenant-id>";
string clientId = "<your-client-id>";
string clientSecret = "<your-client-secret>";
string lakeUrl = "https://<your-storage-account-name>.dfs.core.windows.net";

// Authenticate using client credentials
var credential = new ClientSecretCredential(tenantId, clientId, clientSecret);
var serviceClient = new DataLakeServiceClient(new Uri(lakeUrl), credential);

// Connect to the container (Lakehouse workspace)
var fileSystemClient = serviceClient.GetFileSystemClient("my-workspace");

// Set the directory path including the logical 'Files' container
var directoryClient = fileSystemClient.GetDirectoryClient("Files/Subfolder1/Subfolder2");
await directoryClient.CreateIfNotExistsAsync();

// Upload the file
var fileClient = directoryClient.GetFileClient("MyFile.txt");
using var stream = File.OpenRead("MyFile.txt");
await fileClient.UploadAsync(stream, overwrite: true);

 

Replace:

  • <your-tenant-id>: from Microsoft Entra ID (Azure AD)
  • <your-client-id>: the app registration's "Application (client) ID"
  • <your-client-secret>: from the app registration > Certificates & secrets
  • <your-storage-account-name>: the storage account name (you'll see this in Fabric if you dig into the linked lakehouse)

 

Thanks,

Prashanth Are

MS Fabric community support

 

 
