Hello,
I am wondering how I can run a SELECT against a Lakehouse from an external, home-made tool. Ideally, I would like to send my SQL statement via the Fabric API, but I cannot find the appropriate call in the following documentation: https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-api
Thank you in advance for your help,
Best regards,
Hi @Nicolas_DHONT ,
You can only run a SELECT against a Lakehouse through its SQL analytics endpoint, for example by using the ODBC driver from your local machine.
Fetch the SQL endpoint of your Lakehouse and authenticate with your identity.
Example:
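A minimal sketch with pyodbc, assuming the Microsoft ODBC Driver 18 for SQL Server is installed locally; the server, database, and table names are placeholders you would replace with the values from your Lakehouse's SQL analytics endpoint:

import pyodbc

# Placeholders: copy these values from your Lakehouse's SQL analytics endpoint settings
server = "<your-sql-endpoint>.datawarehouse.fabric.microsoft.com"
database = "<your-lakehouse-name>"

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={server};"
    f"Database={database};"
    "Authentication=ActiveDirectoryInteractive;"  # sign in with your own identity
    "Encrypt=yes;"
)

connection = pyodbc.connect(conn_str)
cursor = connection.cursor()
cursor.execute("SELECT TOP 10 * FROM dbo.my_table")  # placeholder table name
for row in cursor.fetchall():
    print(row)
cursor.close()
connection.close()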
Hello @Nicolas_DHONT - you can query a lakehouse via an API call by creating a user data function (UDF). Below is a script for a UDF I created that queries a lakehouse and returns a list of tables and their respective schemas.
You can customize it by editing the connection alias and the SELECT statement, which I've highlighted in the script below.
Test the function by clicking the play icon next to the function name in the function explorer. The Run pane will open on the right, where you can click the Run button.
Once you are satisfied with the test result, click Generate invocation code, choose between invoking via Python or a Client, and copy the respective script.
This gives you the code to invoke the UDF externally; a rough sketch of what that looks like follows the UDF script below.
UDF Script
import fabric.functions as fn

udf = fn.UserDataFunctions()

# Go to Manage Connections and add a connection to your lakehouse, note the alias
# Replace "Community" with your actual Lakehouse connection alias
@udf.connection(argName="lakehouse", alias="Community")
@udf.function()
def list_all_tables(lakehouse: fn.FabricLakehouseClient) -> list:
    connection = lakehouse.connectToSql()
    cursor = connection.cursor()

    # Query to list all tables in the Lakehouse
    cursor.execute("SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'")

    # Build a list of {column: value} dicts from the result set
    rows = [x for x in cursor]
    columnNames = [x[0] for x in cursor.description]

    tables = []
    for row in rows:
        table_info = {}
        for col, val in zip(columnNames, row):
            table_info[col] = val
        tables.append(table_info)

    cursor.close()
    connection.close()

    return tables
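For illustration, the Python option that Generate invocation code produces boils down to an authenticated HTTP POST. A rough sketch, in which the endpoint URL, token, and empty parameter payload are placeholders (use the exact code Fabric generates for you):

import requests

# Placeholders: copy the real invocation URL from "Generate invocation code"
# and obtain a Microsoft Entra ID bearer token for your identity
udf_endpoint = "<invocation-url-from-generate-invocation-code>"
access_token = "<bearer-token>"

response = requests.post(
    udf_endpoint,
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    },
    json={},  # list_all_tables takes no user parameters
)
response.raise_for_status()
print(response.json())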
If this post helps to answer your questions, please consider marking it as a solution so others can find it more quickly when faced with a similar challenge.
Proud to be a Microsoft Fabric Super User
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.
Thank you.
Hello,
Yes, I validated two different solutions.
Thank you!
Hello,
Thank you @yvaubourg and @Srisakthi for your replies!
I tried the Python script you provided, @Srisakthi, but it is not working due to MFA.
Is there another way to authenticate (maybe with an app registration from the Azure Portal) that would allow a script to connect automatically? Disabling MFA on a user is not a best practice.
Thank you !
Hi @Nicolas_DHONT ,
Yes, you can try a Service Principal, but a few permissions need to be enabled to use it.
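For example, with the ODBC route from earlier in the thread, a sketch might look like this, assuming an app registration in Microsoft Entra ID that has been granted access to the workspace; the client ID, secret, server, and database values are placeholders:

import pyodbc

# Placeholders: values come from your app registration and Lakehouse SQL endpoint
server = "<your-sql-endpoint>.datawarehouse.fabric.microsoft.com"
database = "<your-lakehouse-name>"
client_id = "<app-registration-client-id>"
client_secret = "<app-registration-client-secret>"

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={server};"
    f"Database={database};"
    "Authentication=ActiveDirectoryServicePrincipal;"  # no interactive MFA prompt
    f"UID={client_id};"
    f"PWD={client_secret};"
    "Encrypt=yes;"
)

connection = pyodbc.connect(conn_str)
cursor = connection.cursor()
cursor.execute("SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES")
print(cursor.fetchall())
cursor.close()
connection.close()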
Regards,
Srisakthi
Thank you @Srisakthi !
I need to check how to make the call with a Service Principal (and the app registration), but it looks like the right option 😉
Thanks and best regards,
Nicolas
Hi,
Unfortunately, the Fabric API does not let you get your data; it is there to manage Fabric artifacts.
You have to use the SQL endpoint of your lakehouse or put in place a GraphQL API to get your data out of MS Fabric.
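For the GraphQL route, a minimal sketch, assuming you have created an API for GraphQL item on top of your lakehouse and copied its endpoint; the URL, token, and query field names below are placeholders that depend on your own schema:

import requests

# Placeholders: copy the endpoint from your API for GraphQL item in Fabric
graphql_endpoint = "<your-graphql-endpoint>"
access_token = "<bearer-token>"

# Placeholder query: field and column names depend on the tables your schema exposes
query = """
query {
  my_table(first: 10) {
    items { col1 col2 }
  }
}
"""

response = requests.post(
    graphql_endpoint,
    headers={"Authorization": f"Bearer {access_token}"},
    json={"query": query},
)
response.raise_for_status()
print(response.json())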