Don't miss out! 2025 Microsoft Fabric Community Conference, March 31 - April 2, Las Vegas, Nevada. Use code MSCUST for a $150 discount. Prices go up February 11th. Register now.

amaaiia
Super User

How to send an email with an attached file from notebook

Hi,

I have a notebook with code to send an email, optionally with an attached file. My code can attach a file located in the Files/emails path of the notebook's default lakehouse.

 

attachment = open(relative_attachment_path, 'rb')

 

I want to change the code so that I can attach a file located in any lakehouse (not just the default one), given the workspace_id and the lakehouse_id. I've seen the snippet below, but I don't know if it will work, and I don't know what the connection string of the blob storage is. Where can I get it? I don't have an Azure storage account for Fabric.

# Configure Azure Blob Storage client
from azure.storage.blob import BlobServiceClient

connection_string = 'your_connection_string'
container_name = f'workspace-{workspace_id}-lakehouse-{lakehouse_id}'
blob_name = 'path/to/file.xlsx'

blob_service_client = BlobServiceClient.from_connection_string(connection_string)
blob_client = blob_service_client.get_blob_client(container=container_name, blob=blob_name)

# Download the file from lakehouse
downloader = blob_client.download_blob()
file_content = downloader.readall()
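For what it's worth, if you do want a storage-SDK route like the snippet above: there is no connection string for OneLake. OneLake exposes an ADLS Gen2-compatible endpoint at onelake.dfs.fabric.microsoft.com, and you authenticate with Microsoft Entra ID instead. A minimal sketch under that assumption (requires the azure-identity and azure-storage-file-datalake packages, and an identity with access to the workspace; the IDs are placeholders):

```python
ONELAKE_URL = "https://onelake.dfs.fabric.microsoft.com"

def onelake_file_path(lakehouse_id: str, relative_path: str) -> str:
    # Within a workspace "file system", a lakehouse's files live under <lakehouse_id>/Files/...
    return f"{lakehouse_id}/Files/{relative_path}"

def download_onelake_file(workspace_id: str, lakehouse_id: str, relative_path: str) -> bytes:
    # Requires: pip install azure-identity azure-storage-file-datalake
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    # The workspace plays the role of the container/file system in OneLake's ADLS view
    service = DataLakeServiceClient(ONELAKE_URL, credential=DefaultAzureCredential())
    file_system = service.get_file_system_client(workspace_id)
    file_client = file_system.get_file_client(onelake_file_path(lakehouse_id, relative_path))
    return file_client.download_file().readall()
```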

 Thanks

1 ACCEPTED SOLUTION
nilendraFabric
Solution Supplier

Hello @amaaiia 

 

You don’t need a traditional Azure Blob Storage connection string to access files from another lakehouse in Fabric. Each Fabric lakehouse is stored in OneLake, and you can directly mount it to your notebook session using its ABFS path. Then you can open the file from the local mount path and attach it to your email. For example:

 

from notebookutils import mssparkutils

# Provide the workspace and lakehouse IDs of the lakehouse you want to read
workspace_id = "<YOUR_WORKSPACE_ID>"
lakehouse_id = "<YOUR_LAKEHOUSE_ID>"

# Construct the ABFS path for the target lakehouse
abfs_path = f"abfss://{workspace_id}@onelake.dfs.fabric.microsoft.com/{lakehouse_id}"

# Mount the lakehouse to a local folder in your notebook
mount_point = "/mnt/external_lakehouse"
mssparkutils.fs.mount(abfs_path, mount_point)

# Use the mounted path to open the file
local_file_path = f"{mount_point}/Files/path/to/file.xlsx"
attachment = open(local_file_path, "rb")

# Your email sending logic
send_email(
    subject="File from another lakehouse",
    body="See attached",
    attachment_content=attachment.read(),
    attachment_name="file.xlsx",
)
attachment.close()
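The send_email helper above isn't shown in the thread; for completeness, a minimal sketch of one way it might be built with Python's standard email and smtplib modules (the addresses and SMTP host are placeholders, not from the original post):

```python
import smtplib
from email.message import EmailMessage

def build_email(subject, body, attachment_content, attachment_name,
                sender, recipient):
    # Assemble a MIME message with the file bytes attached
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(body)
    msg.add_attachment(attachment_content,
                       maintype="application", subtype="octet-stream",
                       filename=attachment_name)
    return msg

def send_email(subject, body, attachment_content, attachment_name):
    msg = build_email(subject, body, attachment_content, attachment_name,
                      sender="sender@example.com",        # placeholder
                      recipient="recipient@example.com")  # placeholder
    with smtplib.SMTP("smtp.example.com") as server:      # placeholder host
        server.send_message(msg)
```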

 

Please give it a try and let me know if this works.

 

https://learn.microsoft.com/en-us/fabric/data-engineering/microsoft-spark-utilities

 

Thanks

 



It works, BUT:

1. Instead of:

local_file_path = f"{mount_point}/Files/path/to/file.xlsx"
attachment = open(local_file_path, "rb")

you need to use the absolute path, which includes the notebook_id. That is, when you mount the lakehouse to /mnt/external_lakehouse, it is actually mounted at:

/synfs/notebook/{notebook_id}/mnt/external_lakehouse
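Rather than hard-coding that /synfs/notebook/{notebook_id}/ prefix, MSSparkUtils can resolve it for you with fs.getMountPath. A sketch (the path-joining helper is my own; the Fabric calls are shown as comments because they only run inside a notebook session):

```python
def resolve_local_path(mount_root: str, relative_path: str) -> str:
    # Files in a lakehouse live under the mount's Files folder
    return f"{mount_root.rstrip('/')}/Files/{relative_path}"

# Inside a Fabric notebook session you would obtain the real mount root with:
#   from notebookutils import mssparkutils
#   mount_root = mssparkutils.fs.getMountPath("/mnt/external_lakehouse")
# which returns something like /synfs/notebook/<notebook_id>/mnt/external_lakehouse

mount_root = "/synfs/notebook/<notebook_id>/mnt/external_lakehouse"  # illustrative only
local_file_path = resolve_local_path(mount_root, "path/to/file.xlsx")
```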

Great insights @amaaiia!
