Nicolas_DHONT
Regular Visitor

Select data from Lakehouses via Fabric API

Hello,

I am wondering how I can run a SELECT against a Lakehouse from an external, home-made tool. Ideally, I would like to send my SQL statement via the Fabric API, but I am not able to find an appropriate call in the following documentation: https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-api

Thank you in advance for your help,

Best regards,

2 ACCEPTED SOLUTIONS
Srisakthi
Super User

Hi @Nicolas_DHONT ,

 

You can only run a SELECT against a Lakehouse through its SQL endpoint, for example by using the ODBC driver from your local machine.

Fetch the SQL endpoint of your lakehouse and authenticate with your identity.

Example:

import pyodbc

driver = '{ODBC Driver 18 for SQL Server}'  # must be installed locally
server_name = 'jkshdjhsdf.datawarehouse.pbidedicated.windows.net'  # SQL endpoint of your lakehouse
db_name = 'YourLakehouseName'
user_name = 'abc@xyz.com'
password = 'xxxxxx'

mydb = pyodbc.connect('DRIVER=' + driver + ';SERVER=' + server_name + ';PORT=1433;DATABASE=' + db_name + ';UID=' + user_name + ';PWD=' + password + ';Authentication=ActiveDirectoryPassword;')
cursor = mydb.cursor()
# Sample select query
cursor.execute("SELECT TOP(10) * FROM INFORMATION_SCHEMA.TABLES")
row = cursor.fetchone()
while row:
    print(row)
    row = cursor.fetchone()
 
Regards,
Srisakthi
 
If this answer helps, please mark it as "Accept as Solution".


jennratten
Super User

Hello @Nicolas_DHONT - you can query a lakehouse via an API call by creating a user data function. Below is the script for a UDF I created that queries a lakehouse and returns a list of tables and their respective schemas.

You can customize it by editing the connection alias and the SELECT statement.

 


 

Test the function by clicking the play icon next to the function name in the function explorer. The Run pane will then open on the right, where you can click the Run button.

 


Once you are satisfied with the test result, click Generate invocation code, choose between invoking via Python or a client, and copy the respective script.


 

This will give you the code to copy so you can invoke the UDF externally.
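The generated client code differs per tenant, but as a rough sketch of what the external caller does (the URL shape, function name, and token here are placeholders, not the real generated values), invoking a UDF is an authenticated HTTP POST with a JSON body of parameters:

```python
import json
import urllib.request

# Hypothetical URL -- copy the real one from the Generate invocation code pane.
FUNCTION_URL = "https://<your-udf-endpoint>/functions/list_all_tables/invoke"

def build_invoke_request(url: str, token: str, parameters: dict) -> urllib.request.Request:
    """Build the HTTP POST request used to invoke a Fabric user data function."""
    body = json.dumps(parameters).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# list_all_tables takes no parameters, so the body is an empty JSON object.
req = build_invoke_request(FUNCTION_URL, "<access-token>", {})
# urllib.request.urlopen(req) would perform the call; the response body is JSON.
```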


 

 UDF Script

import fabric.functions as fn

udf = fn.UserDataFunctions()
# Go to Manage Connections and add a connection to your lakehouse, note the alias
# Replace "Community" with your actual Lakehouse connection alias
@udf.connection(argName="lakehouse", alias="Community")
@udf.function()
def list_all_tables(lakehouse: fn.FabricLakehouseClient) -> list:
    connection = lakehouse.connectToSql()
    cursor = connection.cursor()

    # Query to list all tables in the Lakehouse
    cursor.execute("SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'")
    
    rows = [x for x in cursor]
    columnNames = [x[0] for x in cursor.description]

    tables = []
    for row in rows:
        table_info = {}
        for col, val in zip(columnNames, row):
            table_info[col] = val
        tables.append(table_info)

    cursor.close()
    connection.close()

    return tables

 

 

 

If this post helps to answer your questions, please consider marking it as a solution so others can find it more quickly when faced with a similar challenge.

Proud to be a Microsoft Fabric Super User


9 REPLIES
v-priyankata
Community Support

Hi @Nicolas_DHONT 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will be helpful for other community members who have similar problems to solve it faster.

Thank you.

Hello,
Yes, I validated 2 different solutions
Thank you!


Thank you very much @jennratten, I will try it this way next time!

Nicolas_DHONT
Regular Visitor

Hello,

Thank you @yvaubourg and @Srisakthi for your replies!

I tried the Python script you provided, @Srisakthi, but it is not working due to MFA:


Is there another way to authenticate (maybe with an app registration from the Azure Portal) that allows a script to connect automatically? Disabling MFA on a user is not a best practice.

Thank you !

Hi @Nicolas_DHONT ,

 

Yes, you can try with a service principal, but a few permissions need to be enabled to use it.


 

 

Regards,

Srisakthi
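To sketch what that looks like with pyodbc (a hedged example building on the snippet above: the server name, lakehouse name, client id, and secret are placeholders, and the exact permissions still need to be granted on the tenant and workspace), service-principal authentication swaps the user/password keywords for the app registration's credentials:

```python
# Hedged sketch: building the pyodbc connection string for a service
# principal (app registration) instead of a user account, which avoids
# the MFA prompt. All values below are placeholders.

def build_sp_connection_string(server: str, database: str,
                               client_id: str, client_secret: str) -> str:
    """Connection string for the SQL endpoint using service-principal auth."""
    return (
        "DRIVER={ODBC Driver 18 for SQL Server};"
        f"SERVER={server},1433;"
        f"DATABASE={database};"
        f"UID={client_id};"
        f"PWD={client_secret};"
        "Authentication=ActiveDirectoryServicePrincipal;"
        "Encrypt=yes;"
    )

conn_str = build_sp_connection_string(
    "jkshdjhsdf.datawarehouse.pbidedicated.windows.net",  # SQL endpoint
    "YourLakehouseName",
    "<app-client-id>",
    "<client-secret>",
)
# With pyodbc installed: conn = pyodbc.connect(conn_str)
# then query as usual, e.g.
# conn.cursor().execute("SELECT TOP(10) * FROM INFORMATION_SCHEMA.TABLES")
```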

Thank you @Srisakthi !

I need to check how to do the call with a service principal (and the app registration), but it looks like the right option 😉

Thanks and best regards,
Nicolas

yvaubourg
Resolver I

Hi,

Unfortunately, the Fabric REST API does not let you retrieve your data; it is there to manage Fabric artifacts.

You have to use the SQL endpoint of your lakehouse, or put a GraphQL API in place, to get your data outside of Microsoft Fabric.
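For the GraphQL route, a rough sketch of the external call (the endpoint URL and the "customers" type are assumptions; Fabric's "API for GraphQL" item shows the real endpoint and schema in the portal) is a bearer-authenticated POST carrying the query as JSON:

```python
import json
import urllib.request

def build_graphql_request(endpoint: str, token: str, query: str) -> urllib.request.Request:
    """Build the HTTP POST carrying a GraphQL query with a bearer token."""
    body = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        endpoint,
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Placeholder endpoint and type name -- copy the real endpoint from the
# GraphQL API item in the Fabric portal.
query = "query { customers(first: 5) { items { CustomerId CustomerName } } }"
req = build_graphql_request(
    "https://<your-graphql-endpoint>/graphql", "<access-token>", query
)
# urllib.request.urlopen(req) would execute the query; the rows come back
# as JSON under data.customers.items.
```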
