Hi all,
I'm trying to get a UDF working that creates a file in the lakehouse. There is a sample that almost does what I want, but unfortunately it fails when I connect my lakehouse. I get the following error:
Running another sample snippet connecting with a warehouse item works perfectly fine. It seems to be some problem specifically with the lakehouse connection.
Can anyone help?
{
  "functionName": "write_csv_file_in_lakehouse",
  "invocationId": "00000000-0000-0000-0000-000000000000",
  "status": "Failed",
  "errors": [
    {
      "errorCode": "WorkloadException",
      "subErrorCode": "NotFound",
      "message": "User data function: 'write_csv_file_in_lakehouse' invocation failed."
    }
  ]
}
The connection to the data source is set up:
This is the code I'm running:
Hi @michael_muell ,
Please use the below code:
import pandas as pd
import datetime
import fabric.functions as fn
import logging

udf = fn.UserDataFunctions()

@udf.connection(argName="myLakehouse", alias="LH123")
@udf.function()
def write_csv_file_in_lakehouse(myLakehouse: fn.FabricLakehouseClient, employees: list) -> str:
    """
    Writes employee data to Lakehouse Files as a CSV.
    """
    logging.info("Starting CSV file write to Lakehouse")

    # Create timestamped filename
    csvFileName = "Employees_" + str(round(datetime.datetime.now().timestamp())) + ".csv"

    # Create DataFrame and CSV string
    df = pd.DataFrame(employees, columns=["ID", "EmpName", "DepID"])
    csv_string = df.to_csv(index=False)
    csv_bytes = csv_string.encode("utf-8")  # Convert string to bytes

    # Connect to Lakehouse Files and upload
    connection = myLakehouse.connectToFiles()
    file_client = connection.get_file_client(csvFileName)
    file_client.upload_data(csv_bytes, overwrite=True)

    # Close connections
    file_client.close()
    connection.close()

    return f"File '{csvFileName}' was uploaded successfully."
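If you want to sanity-check the CSV-building part locally before deploying, the same pandas logic runs without any Fabric connection. The sample `employees` rows below are an assumption about the expected input shape (list of `[ID, EmpName, DepID]` rows):

```python
import datetime
import pandas as pd

# Assumed input shape: a list of [ID, EmpName, DepID] rows
employees = [[1, "Alice", 10], [2, "Bob", 20]]

# Same filename and CSV construction as in the UDF above
csvFileName = "Employees_" + str(round(datetime.datetime.now().timestamp())) + ".csv"
df = pd.DataFrame(employees, columns=["ID", "EmpName", "DepID"])
csv_string = df.to_csv(index=False)

print(csvFileName)
print(csv_string)
```

This only verifies the data side; the upload itself still needs the lakehouse connection configured in the UDF portal.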
Add Pandas library as shown below:
To test this, I created a pipeline and it worked for me:
The file was created in the Lakehouse as shown below:
Thank you.
This works! Thanks a lot!
This is how it works:
Python (pandas) --> Apache Spark (data lake) --> Delta Lake (lakehouse)
You are missing the Spark DataFrame (data lake) step.
Hi @michael_muell ,
Thank you for reaching out to Microsoft Fabric Community.
The issue is likely caused by a mismatch in the Lakehouse connection alias, or by trying to upload string data with the wrong method. Make sure your UDF connection alias (dateiupload) matches exactly what's set in the UDF UI. Also, replace upload_data with upload_text, since you're uploading a CSV string rather than binary data. Here's the fix:
csvFile.upload_text(csv_string, overwrite=True)
Here's the documentation for reference:
fabric.functions.FabricLakehouseClient class | Microsoft Learn
Thank you.
Hi @v-venuppu
Thanks for the reply. I triple-checked, and the alias matches exactly.
The code change did not help either. The problem is the connection to the lakehouse itself.
Any other suggestions?
Best regards
Michael