I'm trying to use delta-rs' write_deltalake() in a Fabric UDF, converting code I originally tested in a Fabric notebook. I've got the following code:
import pyarrow as pa
from deltalake import write_deltalake
# notebookutils is preinstalled in Fabric notebooks
storage_options = {'bearer_token': notebookutils.credentials.getToken('storage'), 'use_fabric_endpoint': 'true'}
table_path = 'abfss://[workspace_id]@onelake.dfs.fabric.microsoft.com/[lakehouse_id]/Tables/dbo/[table_name]'
write_deltalake(table_path, data=pa.Table.from_pandas(df), mode='overwrite', storage_options=storage_options)
This works great in my notebook and writes the data in my pandas DataFrame to the table in my lakehouse.
I'm trying to put this code in a Fabric UDF, but I don't know how to authenticate properly. I've tried using credential = DefaultAzureCredential(), but it fails.
How can I properly authenticate in my Fabric UDF to do what those three lines above do so easily?
Hello @diablo9081
Create a connection to your target Lakehouse inside your UDF item (Manage connections > Add data connection). Note the generated alias; you'll need to put it in the connection decorator, as shown in the sketch after these steps. Lakehouse connections provide read/write access to Files.
Add libraries to the UDF item (Toolbar → Library management):
polars (for the DataFrame)
azure-identity and azure-storage-file-datalake (for the authentication snippet below)
deltalake and pyarrow (for write_deltalake)
Publish the library changes. Public PyPI packages are supported in UDFs.
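If it helps, here is a minimal sketch of the decorator wiring, based on the Fabric user data functions samples; the alias "MyLakehouse", the argName, and the function body are placeholders for your own names:

import fabric.functions as fn

udf = fn.UserDataFunctions()

# Bind the Lakehouse connection you created to a parameter via its alias
# ("MyLakehouse" is a placeholder for the alias your UDF item generated)
@udf.connection(argName="myLakehouse", alias="MyLakehouse")
@udf.function()
def write_to_lakehouse(myLakehouse: fn.FabricLakehouseClient) -> str:
    # connectToFiles() returns an ADLS-style client scoped to the Files area
    files = myLakehouse.connectToFiles()
    # ... read/write via files.get_file_client(...) as needed ...
    return "ok"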
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Service principal details; keep these in a secret store, not in code
tenant_id = "<tenant_guid>"
client_id = "<app_client_id>"
client_secret = "<client_secret>"

cred = ClientSecretCredential(tenant_id, client_id, client_secret)

# OneLake exposes an ADLS Gen2-compatible DFS endpoint, so the standard
# Data Lake client works against it
svc = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=cred
)
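To do inside the UDF what your three notebook lines did, you can mint a storage-scoped token from that same credential and pass it to write_deltalake through storage_options, which is exactly the role notebookutils.credentials.getToken('storage') played. (This is likely also why DefaultAzureCredential() failed: the UDF host doesn't carry your notebook identity, so the default credential chain has nothing to sign in with.) A sketch, assuming the service principal has been granted access to the workspace; the bracketed placeholders are carried over from your question:

from azure.identity import ClientSecretCredential
from deltalake import write_deltalake
import pyarrow as pa

cred = ClientSecretCredential(tenant_id, client_id, client_secret)
# Token for the Azure Storage audience; delta-rs accepts it via the same
# bearer_token option you used in the notebook
token = cred.get_token("https://storage.azure.com/.default").token
storage_options = {'bearer_token': token, 'use_fabric_endpoint': 'true'}
table_path = 'abfss://[workspace_id]@onelake.dfs.fabric.microsoft.com/[lakehouse_id]/Tables/dbo/[table_name]'
write_deltalake(table_path, data=pa.Table.from_pandas(df), mode='overwrite', storage_options=storage_options)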
Hi @diablo9081,
We would like to confirm whether our community member's answer resolves your query, or if you need further help. If you still have any questions or need more support, please feel free to let us know. We are happy to help you.
Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support
@deborshi_nag, thanks for your prompt response.