banafsh
Frequent Visitor

Issue with Uploading CSV to Azure Storage Account from Fabric Python Notebook

Hello,

We're working on a task to upload CSV files to an Azure storage account using Fabric Python notebooks. However, we've encountered an issue where the connection fails unless we grant public access.

Below is a brief of our approach:

  1. We generate random names using the Python Faker library.
  2. These names are then structured into a Pandas DataFrame.
  3. We aim to save this DataFrame as a CSV file directly to our Azure Storage container without having to save it locally.

Here's the code snippet:

import pandas as pd
from faker import Faker
import io
from azure.storage.blob import BlobServiceClient

# Generate Random Names using Faker
fake = Faker()

# Number of names to generate
num_names = 1557

# Initialize lists to store names and last names
names = []
last_names = []

for _ in range(num_names):
    names.append(fake.first_name())
    last_names.append(fake.last_name())

# Create a Pandas DataFrame
data = {'First Name': names, 'Last Name': last_names}
df = pd.DataFrame(data)

# Authenticate with Azure Blob Storage
connection_string = "connectionstring"
blob_service_client = BlobServiceClient.from_connection_string(connection_string)

# Get a client to interact with the 'testcontainer' container
container_client = blob_service_client.get_container_client("testcontainer")

# Convert DataFrame to CSV in-memory
output = io.StringIO()
df.to_csv(output, index=False)
csv_data = output.getvalue().encode('utf-8')

# Upload the CSV data to the blob in the 'testcontainer' container
blob_name = 'random_names.csv'
container_client.upload_blob(name=blob_name, data=csv_data)

print(f"Saved 1557 random names to '{blob_name}' in 'testcontainer'")
 
 

The code runs without errors, but the upload fails unless we give the container public access. We'd like to avoid setting our container to public.

Any guidance on resolving this issue while maintaining the security of our storage account would be greatly appreciated.

Thank you!

 
1 ACCEPTED SOLUTION

Hi @banafsh ,

Apologies for the issue you have been facing.

You cannot mount an Azure storage account that has public access disabled without using a private endpoint. Once you disable public access to a storage account, it can only be accessed from within a virtual network (VNet) using a private endpoint.

Private endpoints provide a secure way to connect to Azure services from within a VNet. They use a private IP address from the VNet address space for each storage account service, and network traffic between the clients on the VNet and the storage account traverses over the VNet and a private link on the Microsoft backbone network.

Currently, private endpoints are not supported in Fabric, but they are on the roadmap.

 


You can refer to the planned timeline here: Link1

 

Hope this helps. Please let us know if you have any further queries.

 

 

 


4 REPLIES
banafsh
Frequent Visitor

Hi @v-nikhilan-msft,

 

Thank you for your response. I tried your suggested solution and it works when access is enabled from all networks, but as soon as I change the storage account access to "Enabled from selected virtual networks and IP addresses" it fails to upload the CSV file. Any suggestions?

 

Thank you!


Hi @banafsh ,
Glad that your issue got resolved. Please continue using Fabric Community for any help regarding your queries.

v-nikhilan-msft
Community Support

Hi @banafsh ,
Thanks for using Fabric Community.
You can ingest the DataFrame into the Azure Storage account by creating a mount point. I have created a repro and attached screenshots for your reference.

1) Create a mount point by giving the names of your storage account and container.

from notebookutils import mssparkutils

# Mount the container; the second argument is the mount point name ("/mymount" here is illustrative)
mssparkutils.fs.mount(
    "abfss://mycontainer@<accountname>.blob.core.windows.net", "/mymount")

(screenshot)
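If it is useful, one way to confirm the mount succeeded is to list the active mount points. This is a minimal sketch, not part of the original reply, using notebookutils' mssparkutils.fs.mounts():

from notebookutils import mssparkutils

# Print every mount point active in this notebook session
for m in mssparkutils.fs.mounts():
    print(m)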

2) I have created a DataFrame with some random names and ingested the data into the storage account using the df.to_csv() function (see the sketch below).

(screenshot)
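For reference, the write in step 2 can be done by resolving the mount point to its local path and calling pandas' to_csv on it. A minimal sketch, assuming the mount point from step 1 is named "/mymount"; the file name is also illustrative:

import pandas as pd
from notebookutils import mssparkutils

# Example DataFrame standing in for the Faker-generated names
df = pd.DataFrame({'First Name': ['Ana', 'Ben'], 'Last Name': ['Lee', 'Ortiz']})

# Resolve the mount point to the local file system path visible to the notebook
local_path = mssparkutils.fs.getMountPath("/mymount")  # "/mymount" is an assumed name

# Write the CSV straight into the mounted container
df.to_csv(f"{local_path}/random_names.csv", index=False)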


3) The file got placed in the container.

(screenshot)

 

4) To perform this, make sure your identity has the Storage Blob Data Contributor RBAC role assigned on the storage account.

(screenshot)

 

Please refer to this document for more information: Link1


Hope this helps. Please let me know if you have any further questions.

 
