Hi,
I have a notebook with the code to send an email. The email can be sent with an attached file. My code is able to send an attached file located in the Files/emails path of the notebook's default lakehouse.
attachment = open(relative_attachment_path, 'rb')
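For context, the bytes read above typically feed into a MIME message. A minimal sketch using Python's standard library `email.message`; the `build_message` helper and the addresses are illustrative, not taken from the original notebook:

```python
from email.message import EmailMessage

def build_message(sender: str, recipient: str, subject: str, body: str,
                  attachment_bytes: bytes, attachment_name: str) -> EmailMessage:
    """Build an email carrying a single binary attachment."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    # Attach the file content read earlier with open(..., 'rb')
    msg.add_attachment(
        attachment_bytes,
        maintype="application",
        subtype="octet-stream",
        filename=attachment_name,
    )
    return msg
```

The resulting `EmailMessage` can then be handed to whatever transport the notebook already uses (e.g. `smtplib` or an HTTP mail API).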
I want to change the code so I can attach a file located in any lakehouse (not the default one) by providing the workspace_id and the lakehouse_id. I've seen this piece of code; I don't know if it will work, and I don't know what the connection string of the blob storage is. Where can I get it? I don't have an Azure storage account for Fabric.
from azure.storage.blob import BlobServiceClient

# Configure Azure Blob Storage client
connection_string = 'your_connection_string'
container_name = f'workspace-{workspace_id}-lakehouse-{lakehouse_id}'
blob_name = 'path/to/file.xlsx'
blob_service_client = BlobServiceClient.from_connection_string(connection_string)
blob_client = blob_service_client.get_blob_client(container=container_name, blob=blob_name)
# Download the file from lakehouse
downloader = blob_client.download_blob()
file_content = downloader.readall()
Thanks
Hello @amaaiia
You don’t need a traditional Azure Blob Storage connection string to access files from another lakehouse in Fabric. Each Fabric lakehouse is stored in OneLake, and you can directly mount it to your notebook session using its ABFS path. Then you can open the file from the local mount path and attach it to your email. For example:
from notebookutils import mssparkutils
# Provide the workspace and lakehouse IDs of the lakehouse you want to read
workspace_id = "<YOUR_WORKSPACE_ID>"
lakehouse_id = "<YOUR_LAKEHOUSE_ID>"
# Construct the ABFS path for the target lakehouse
abfs_path = f"abfss://{workspace_id}@onelake.dfs.fabric.microsoft.com/{lakehouse_id}"
# Mount the lakehouse to a local folder in your notebook
mount_point = "/mnt/external_lakehouse"
mssparkutils.fs.mount(abfs_path, mount_point)
# Use the mounted path to open the file
local_file_path = f"{mount_point}/Files/path/to/file.xlsx"
attachment = open(local_file_path, "rb")
# Your email sending logic
send_email(
    subject="File from another lakehouse",
    body="See attached",
    attachment_content=attachment.read(),
    attachment_name="file.xlsx"
)
attachment.close()
Please give it a try and let me know if this works.
https://learn.microsoft.com/en-us/fabric/data-engineering/microsoft-spark-utilities
Thanks
It works, BUT:
1. Instead of:
local_file_path = f"{mount_point}/Files/path/to/file.xlsx"
attachment = open(local_file_path, "rb")
You need to read the absolute path, with the notebook_id added. That is, when you mount the lakehouse to /mnt/external_lakehouse, it is actually being mounted to:
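The correction above can be sketched as follows, assuming the documented `mssparkutils.fs.getMountPath` API, which resolves the logical mount point to its real session-local directory. The `resolve_local_path` helper is illustrative (not from the original posts) and falls back to the logical mount point when run outside a Fabric notebook:

```python
def resolve_local_path(mount_point: str, relative_path: str) -> str:
    """Return a local filesystem path for a file under a Fabric mount.

    Inside a Fabric notebook, mssparkutils.fs.getMountPath maps the logical
    mount point to the real session-local directory, so plain open() works
    against its result. Outside Fabric (illustrative fallback), the logical
    mount point is returned unchanged.
    """
    try:
        from notebookutils import mssparkutils
        root = mssparkutils.fs.getMountPath(mount_point)
    except ImportError:
        # Not running in a Fabric notebook session
        root = mount_point
    return f"{root}/{relative_path}"

# Usage inside the notebook, after mounting:
# attachment = open(
#     resolve_local_path("/mnt/external_lakehouse", "Files/path/to/file.xlsx"),
#     "rb",
# )
```

This avoids hard-coding the session-specific prefix that the mount actually lives under.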