<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: UDF Connection to Lakehouse does not work in Data Engineering</title>
    <link>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4763072#M10918</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/1250613"&gt;@v-venuppu&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;Thanks for the reply. I triple-checked and the alias matches exactly.&amp;nbsp;&lt;BR /&gt;Also, the code change did not help. The problem is the connection to the lakehouse.&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="michael_muell_0-1752560660545.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1283096i2331110EFBDF3CF6/image-size/medium?v=v2&amp;amp;px=400" role="button" title="michael_muell_0-1752560660545.png" alt="michael_muell_0-1752560660545.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any other suggestions?&lt;BR /&gt;&lt;BR /&gt;Best regards&amp;nbsp;&lt;BR /&gt;Michael&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Tue, 15 Jul 2025 06:24:50 GMT</pubDate>
    <dc:creator>michael_muell</dc:creator>
    <dc:date>2025-07-15T06:24:50Z</dc:date>
    <item>
      <title>UDF Connection to Lakehouse does not work</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4760308#M10830</link>
      <description>&lt;P&gt;Hi all,&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm trying to get a UDF to work that creates a file in the lakehouse. I saw that there is a sample that almost achieves what I want. Unfortunately it does not work when I connect my lakehouse. I get the following error:&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;Running another sample snippet connecting with a warehouse item works perfectly fine. It seems to be some problem specifically with the lakehouse connection.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Can anyone help?&amp;nbsp;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;{&lt;BR /&gt;"functionName": "write_csv_file_in_lakehouse",&lt;BR /&gt;"invocationId": "00000000-0000-0000-0000-000000000000",&lt;BR /&gt;"status": "Failed",&lt;BR /&gt;"errors": [&lt;BR /&gt;{&lt;BR /&gt;"errorCode": "WorkloadException",&lt;BR /&gt;"subErrorCode": "NotFound",&lt;BR /&gt;"message": "User data function: 'write_csv_file_in_lakehouse' invocation failed."&lt;BR /&gt;}&lt;BR /&gt;]&lt;BR /&gt;}&lt;BR /&gt;&lt;BR /&gt;The connection to the datasource is setup:&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="michael_muell_1-1752253080383.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1282519i3E144CE74689E03B/image-size/medium?v=v2&amp;amp;px=400" role="button" title="michael_muell_1-1752253080383.png" alt="michael_muell_1-1752253080383.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;This is the code I'm running:&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;import&lt;/SPAN&gt;&lt;SPAN&gt; datetime&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;import&lt;/SPAN&gt;&lt;SPAN&gt; fabric.functions &lt;/SPAN&gt;&lt;SPAN&gt;as&lt;/SPAN&gt;&lt;SPAN&gt; fn&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;import&lt;/SPAN&gt;&lt;SPAN&gt; 
logging&lt;/SPAN&gt;&lt;/DIV&gt;&lt;BR /&gt;&lt;DIV&gt;&lt;SPAN&gt;udf = fn.UserDataFunctions()&lt;/SPAN&gt;&lt;/DIV&gt;&lt;BR /&gt;&lt;DIV&gt;&lt;SPAN&gt;import&lt;/SPAN&gt;&lt;SPAN&gt; pandas &lt;/SPAN&gt;&lt;SPAN&gt;as&lt;/SPAN&gt;&lt;SPAN&gt; pd &lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;import&lt;/SPAN&gt;&lt;SPAN&gt; datetime&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;# Select 'Manage connections' and add a connection to a Lakehouse.&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;#Replace the alias "&amp;lt;My Lakehouse alias&amp;gt;" with your connection alias.&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;@udf&lt;/SPAN&gt;&lt;SPAN&gt;.connection(argName=&lt;/SPAN&gt;&lt;SPAN&gt;"myLakehouse"&lt;/SPAN&gt;&lt;SPAN&gt;, alias=&lt;/SPAN&gt;&lt;SPAN&gt;"dateiupload"&lt;/SPAN&gt;&lt;SPAN&gt;)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;@udf&lt;/SPAN&gt;&lt;SPAN&gt;.function()&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;def&lt;/SPAN&gt;&lt;SPAN&gt; write_csv_file_in_lakehouse(myLakehouse: fn.FabricLakehouseClient, employees: &lt;/SPAN&gt;&lt;SPAN&gt;list&lt;/SPAN&gt;&lt;SPAN&gt;)-&amp;gt; &lt;/SPAN&gt;&lt;SPAN&gt;str&lt;/SPAN&gt;&lt;SPAN&gt;:&lt;/SPAN&gt;&lt;/DIV&gt;&lt;BR /&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; logging.info(&lt;/SPAN&gt;&lt;SPAN&gt;'test'&lt;/SPAN&gt;&lt;SPAN&gt;)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;&lt;SPAN&gt;'''&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; Description: Write employee data to lakehouse as timestamped CSV file using pandas.&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; Args:&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; myLakehouse (fn.FabricLakehouseClient): Fabric lakehouse connection.&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; employees (list): List 
of employee records as [ID, Name, DeptID] arrays.&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; Returns:&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; str: Confirmation message with filename and viewing instructions.&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; Example:&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; employees = [[1,"John Smith", 31], [2,"Kayla Jones", 33]]&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Creates "Employees1672531200.csv" in lakehouse&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; '''&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; csvFileName = &lt;/SPAN&gt;&lt;SPAN&gt;"Employees"&lt;/SPAN&gt;&lt;SPAN&gt; + &lt;/SPAN&gt;&lt;SPAN&gt;str&lt;/SPAN&gt;&lt;SPAN&gt;(&lt;/SPAN&gt;&lt;SPAN&gt;round&lt;/SPAN&gt;&lt;SPAN&gt;(datetime.datetime.now().timestamp())) + &lt;/SPAN&gt;&lt;SPAN&gt;".csv"&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;&lt;SPAN&gt;# Convert the data to a DataFrame&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; df = pd.DataFrame(employees, columns=[&lt;/SPAN&gt;&lt;SPAN&gt;'ID'&lt;/SPAN&gt;&lt;SPAN&gt;,&lt;/SPAN&gt;&lt;SPAN&gt;'EmpName'&lt;/SPAN&gt;&lt;SPAN&gt;, &lt;/SPAN&gt;&lt;SPAN&gt;'DepID'&lt;/SPAN&gt;&lt;SPAN&gt;])&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;&lt;SPAN&gt;# Write the DataFrame to a CSV file&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; csv_string = 
df.to_csv(index=&lt;/SPAN&gt;&lt;SPAN&gt;False&lt;/SPAN&gt;&lt;SPAN&gt;)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;&lt;SPAN&gt;# Upload the CSV file to the Lakehouse&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; connection = myLakehouse.connectToFiles()&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; csvFile = connection.get_file_client(csvFileName) &amp;nbsp;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; csvFile.upload_data(csv_string, overwrite=&lt;/SPAN&gt;&lt;SPAN&gt;True&lt;/SPAN&gt;&lt;SPAN&gt;)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;BR /&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; csvFile.close()&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; connection.close()&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;&lt;SPAN&gt;return&lt;/SPAN&gt;&lt;SPAN&gt; f&lt;/SPAN&gt;&lt;SPAN&gt;"File {csvFileName} was written to the Lakehouse. Open the Lakehouse in &lt;A href="https://app.fabric.microsoft.com" target="_blank"&gt;https://app.fabric.microsoft.com&lt;/A&gt; to view the files"&lt;/SPAN&gt;&lt;/DIV&gt;&lt;/DIV&gt;</description>
      <pubDate>Fri, 11 Jul 2025 17:02:48 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4760308#M10830</guid>
      <dc:creator>michael_muell</dc:creator>
      <dc:date>2025-07-11T17:02:48Z</dc:date>
    </item>
    <item>
      <title>Re: UDF Connection to Lakehouse does not work</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4761589#M10864</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/1307813"&gt;@michael_muell&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;Thank you for reaching out to Microsoft Fabric Community.&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-teams="true"&gt;The issue is likely caused by a mismatch in the Lakehouse connection alias, or by uploading string data with the wrong method. Make sure your UDF connection alias (dateiupload) matches exactly what is set in the UDF UI. Also, replace upload_data with upload_text, since you are uploading a CSV string, not binary data. Here is the fix:&lt;BR /&gt;csvFile.upload_text(csv_string, overwrite=True)&lt;/SPAN&gt;&lt;/P&gt;
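The str-vs-bytes distinction this reply rests on can be checked locally. The sketch below (stdlib only; the Fabric client and the upload_text/upload_data methods named above come from the thread and are not exercised here) just builds the CSV payload both ways:

```python
import csv
import io

# Employee rows in the shape used in the question: [ID, Name, DeptID]
employees = [[1, "John Smith", 31], [2, "Kayla Jones", 33]]

# Build the CSV payload in memory (a stand-in for df.to_csv(index=False))
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["ID", "EmpName", "DepID"])
writer.writerows(employees)

csv_string = buf.getvalue()             # str:   what a text upload expects
csv_bytes = csv_string.encode("utf-8")  # bytes: what a binary upload expects

print(type(csv_string).__name__, type(csv_bytes).__name__)  # prints: str bytes
```

Passing the wrong one of these two to an upload call is a common source of opaque invocation failures.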
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-teams="true"&gt;Here is the documentation for reference:&lt;BR /&gt;&lt;A href="https://learn.microsoft.com/en-us/python/api/fabric-user-data-functions/fabric.functions.fabriclakehouseclient?view=fabric-user-data-functions-python-latest" target="_blank"&gt;fabric.functions.FabricLakehouseClient class | Microsoft Learn&lt;/A&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-teams="true"&gt;Thank you.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
</description>
      <pubDate>Mon, 14 Jul 2025 07:11:42 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4761589#M10864</guid>
      <dc:creator>v-venuppu</dc:creator>
      <dc:date>2025-07-14T07:11:42Z</dc:date>
    </item>
    <item>
      <title>Re: UDF Connection to Lakehouse does not work</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4763072#M10918</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/1250613"&gt;@v-venuppu&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;Thanks for the reply. I triple-checked and the alias matches exactly.&amp;nbsp;&lt;BR /&gt;Also, the code change did not help. The problem is the connection to the lakehouse.&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="michael_muell_0-1752560660545.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1283096i2331110EFBDF3CF6/image-size/medium?v=v2&amp;amp;px=400" role="button" title="michael_muell_0-1752560660545.png" alt="michael_muell_0-1752560660545.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any other suggestions?&lt;BR /&gt;&lt;BR /&gt;Best regards&amp;nbsp;&lt;BR /&gt;Michael&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 15 Jul 2025 06:24:50 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4763072#M10918</guid>
      <dc:creator>michael_muell</dc:creator>
      <dc:date>2025-07-15T06:24:50Z</dc:date>
    </item>
    <item>
      <title>Re: UDF Connection to Lakehouse does not work</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4763266#M10922</link>
      <description>&lt;P&gt;This is how it works:&lt;BR /&gt;Python (pandas) --&amp;gt; Apache Spark (data lake) --&amp;gt; Delta Lake (data lakehouse)&lt;/P&gt;&lt;DIV&gt;&lt;SPAN&gt;# Convert the data to a DataFrame&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;df = pd.DataFrame(employees, columns=['ID', 'EmpName', 'DepID'])&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;# Convert the pandas DataFrame to a Spark DataFrame&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;sdf = spark.createDataFrame(df)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;or&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;sdf = spark.read.csv("Employees_" + str(round(datetime.datetime.now().timestamp())) + ".csv", header=True)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;# Write the Spark DataFrame to Delta format&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;sdf.write.format("delta").mode("overwrite").saveAsTable("DimEmployees")&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;You are missing a Spark DataFrame (the data lake layer).&lt;/STRONG&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 18 Jul 2025 07:59:25 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4763266#M10922</guid>
      <dc:creator>BhaveshPatel</dc:creator>
      <dc:date>2025-07-18T07:59:25Z</dc:date>
    </item>
    <item>
      <title>Re: UDF Connection to Lakehouse does not work</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4767751#M11024</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/1307813"&gt;@michael_muell&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;Please use the code below:&lt;/P&gt;
&lt;P&gt;import pandas as pd&lt;/P&gt;
&lt;P&gt;import datetime&lt;/P&gt;
&lt;P&gt;import fabric.functions as fn&lt;/P&gt;
&lt;P&gt;import logging&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;udf = fn.UserDataFunctions()&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;@udf.connection(argName=&lt;SPAN&gt;"myLakehouse"&lt;/SPAN&gt;, alias=&lt;SPAN&gt;"LH123"&lt;/SPAN&gt;)&lt;/P&gt;
&lt;P&gt;@udf.function()&lt;/P&gt;
&lt;P&gt;def write_csv_file_in_lakehouse(myLakehouse: fn.FabricLakehouseClient, employees: list) -&amp;gt; str:&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp;&amp;nbsp;&lt;SPAN&gt;"""&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; Writes employee data to Lakehouse Files as a CSV.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&amp;nbsp; &amp;nbsp; """&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; logging.info(&lt;SPAN&gt;"Starting CSV file write to Lakehouse"&lt;/SPAN&gt;)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; # Create timestamped filename&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; csvFileName = &lt;SPAN&gt;"Employees_"&lt;/SPAN&gt; + str(round(datetime.datetime.now().timestamp())) + &lt;SPAN&gt;".csv"&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; # Create DataFrame and CSV string&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; df = pd.DataFrame(employees, columns=[&lt;SPAN&gt;"ID"&lt;/SPAN&gt;, &lt;SPAN&gt;"EmpName"&lt;/SPAN&gt;, &lt;SPAN&gt;"DepID"&lt;/SPAN&gt;])&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; csv_string = df.to_csv(index=False)&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; csv_bytes = csv_string.encode(&lt;SPAN&gt;"utf-8"&lt;/SPAN&gt;) &amp;nbsp;# Convert string to bytes&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; # Connect to Lakehouse Files and upload&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; connection = myLakehouse.connectToFiles()&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; file_client = connection.get_file_client(csvFileName)&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; file_client.upload_data(csv_bytes, overwrite=True)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; # Close connections&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; file_client.close()&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; connection.close()&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; return f&lt;SPAN&gt;"File '{csvFileName}' was uploaded successfully."&lt;/SPAN&gt;&lt;/P&gt;
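The filename and payload logic above can be dry-run locally without a Fabric workspace. In this sketch (stdlib only; the CSV is built by hand instead of via pandas, and a plain file write stands in for file_client.upload_data) only the function's output is checked:

```python
import datetime
import pathlib
import tempfile

employees = [[1, "John Smith", 31], [2, "Kayla Jones", 33]]

# Timestamped filename, as in the function above
csvFileName = "Employees_" + str(round(datetime.datetime.now().timestamp())) + ".csv"

# CSV payload built by hand (stand-in for pd.DataFrame(...).to_csv(index=False))
header = "ID,EmpName,DepID"
rows = [",".join(str(v) for v in rec) for rec in employees]
csv_bytes = "\n".join([header] + rows + [""]).encode("utf-8")

# Plain file write as a stand-in for file_client.upload_data(csv_bytes, overwrite=True)
out = pathlib.Path(tempfile.mkdtemp()) / csvFileName
out.write_bytes(csv_bytes)

print(out.name)
```

Checking the bytes this way before wiring up the connection helps separate payload bugs from connection or alias problems.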
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Add Pandas library as shown below:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vvenuppu_0-1752812737726.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1284177iD1F9C95F705F925C/image-size/medium?v=v2&amp;amp;px=400" role="button" title="vvenuppu_0-1752812737726.png" alt="vvenuppu_0-1752812737726.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-teams="true"&gt;To test this, I created a pipeline and it worked for me:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vvenuppu_1-1752812837093.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1284179i890D7CFFF13BF859/image-size/medium?v=v2&amp;amp;px=400" role="button" title="vvenuppu_1-1752812837093.png" alt="vvenuppu_1-1752812837093.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;The file was created in the Lakehouse, as shown below:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vvenuppu_2-1752812904352.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1284180iA1DF890FEDB67D66/image-size/medium?v=v2&amp;amp;px=400" role="button" title="vvenuppu_2-1752812904352.png" alt="vvenuppu_2-1752812904352.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 18 Jul 2025 04:31:55 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4767751#M11024</guid>
      <dc:creator>v-venuppu</dc:creator>
      <dc:date>2025-07-18T04:31:55Z</dc:date>
    </item>
    <item>
      <title>Re: UDF Connection to Lakehouse does not work</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4768623#M11052</link>
      <description>&lt;P&gt;This works! Thanks a lot!&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 18 Jul 2025 13:55:35 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/4768623#M11052</guid>
      <dc:creator>michael_muell</dc:creator>
      <dc:date>2025-07-18T13:55:35Z</dc:date>
    </item>
    <item>
      <title>Re: UDF Connection to Lakehouse does not work</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/5124172#M15277</link>
      <description>&lt;P&gt;If you are working in a production environment or a corporate setup, communication between Fabric items is usually set to private. In that case, even with an otherwise correct setup, you can still face a proxy 400 error. If so, you have to ask your admins to whitelist or allow communication between the UDF and the lakehouse.&lt;/P&gt;</description>
      <pubDate>Mon, 02 Mar 2026 06:13:30 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/UDF-Connection-to-Lakehouse-does-not-work/m-p/5124172#M15277</guid>
      <dc:creator>usmanFawad</dc:creator>
      <dc:date>2026-03-02T06:13:30Z</dc:date>
    </item>
  </channel>
</rss>

