Richtpt
Frequent Visitor

How to send data to a Fabric Lakehouse from a Power Automate Flow?

I have a Power Automate Flow that runs when a new email shows up in a folder. It captures data from the body of the email and then writes that data to a SQL Server. Works great. Now I want to change this to write the data to a Fabric Lakehouse. I've done a lot of googling and asking various AIs how to do this. Most mention adding an "Execute a SQL Query (V2)" action. I've done that, but when I run the flow, I get a BadRequest error.

 

I'm guessing I have the connection to my Lakehouse set up incorrectly. In the action, for Server Name, I'm using the SQL endpoint from my Lakehouse settings, and for Database Name I'm using the name of my Lakehouse. I then added a query that does an INSERT into one of the tables. Right now that insert has hard-coded values just so I can figure out how to get this working. Once it works, I'll switch those values to values from my flow.

 

Oh, I also had to create a new connection for this action. For Authentication Type I selected Microsoft Entra ID Integrated. After signing in, I got a screen where I could enter a Base Resource URL (https://onelake.dfs.fabric.microsoft.com/) and a Microsoft Entra ID Resource URI (https://storage.azure.com/). That connected fine, but maybe I did something wrong here? I picked those values based on other web pages I read.

 

Does anyone have any suggestions on how to fix the BadRequest?  Or suggestions on the best way to send data from a Power Automate Flow to a Fabric Lakehouse?  Thanks very much!

1 ACCEPTED SOLUTION
svenchio
Solution Supplier

Hi @Richtpt  I read your question and the answers you've received so far, and yes, that is correct: the Lakehouse SQL endpoint is read-only. But I want to clarify that, in Power Automate, the "Execute a SQL Query (V2)" action is part of the SQL Server connector, which is designed to interact with SQL-based engines, specifically Microsoft SQL Server or Azure SQL Database. It expects a traditional SQL engine that supports T-SQL syntax and responds to SQL queries over a standard SQL endpoint.

 

The SQL Query (V2) action is not natively compatible with a Microsoft Fabric Lakehouse. Even though the Lakehouse provides a SQL analytics endpoint, that endpoint supports only a minor subset of T-SQL syntax and behavior, and definitely not writes, only reads. So, if you want to WRITE data to your Lakehouse, the options are:

 

 

1. Write the data to a Fabric SQL database, and

2. Merge (move the data from Fabric SQL to the Lakehouse) via one of these methods:

 

a) Notebooks

b) Data Factory pipelines

c) Dataflows Gen2
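To make the notebook option (a) concrete, here is a minimal sketch. In a Fabric notebook the default lakehouse is typically mounted at /lakehouse/default, so files written under its Files folder land in OneLake; the base path is a parameter here so the sketch runs anywhere, and the column names (sender, subject, amount) are hypothetical placeholders, not anything from the original flow.

```python
import csv
import os

def append_email_rows(base_path, rows):
    """Append parsed email rows to a CSV under the lakehouse Files area.

    In a Fabric notebook, base_path would be "/lakehouse/default" (the
    usual default-lakehouse mount point); here it is a parameter so the
    sketch runs anywhere. Column names are illustrative placeholders.
    """
    target = os.path.join(base_path, "Files", "email_data.csv")
    os.makedirs(os.path.dirname(target), exist_ok=True)
    write_header = not os.path.exists(target)
    with open(target, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["sender", "subject", "amount"])
        if write_header:
            writer.writeheader()  # header only on first write
        writer.writerows(rows)
    return target
```

From there, the same notebook (or a later one) can load the CSV into a Delta table so it shows up as a Lakehouse table.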

 

Perhaps there are other, more creative options, but these are the ones I recommend because they are the official, well-documented ones.

 

Why SQL? Because it is a fully functional, traditional SQL environment that supports primary keys, foreign keys, etc., which it sounds like you will need (just be careful that only Entra ID authentication is allowed).

 

In summary, the answer to your question "Does anyone have any suggestions on how to fix the BadRequest?" is that it isn't really a bad request; this setup is simply not supposed to work that way, and the options above are my suggested alternatives. Hopefully you can accept this as the answer, or leave kudos for the info. Wishing you the best of luck, mate!

 


6 REPLIES 6

Thanks. That's interesting about the Execute SQL Query action; a lot of websites I found suggested doing it that way. And it makes sense that the "BadRequest" isn't really bad, just not how it's supposed to work. Would be nice if Microsoft would give a better error message. 😉

 

I'm not sure I understand what you mean by writing the data to Fabric SQL. You mean some Fabric database? I know we can pull data from an on-prem SQL Server into a Fabric Lakehouse. Maybe that's the best way to do this?

Hi @Richtpt ,
Thanks for reaching out to the Microsoft Fabric community forum.

 

I would also like to take a moment to thank @svenchio for actively participating and for the solutions you've been sharing in the community forum. Your contributions make a real difference.

I hope the details shared in this thread help you fix the issue. If you still have any questions or need more help, feel free to reach out. We're always here to support you.

 

 

Best Regards, 
Community Support Team  

tayloramy
Community Champion

Hi @Richtpt

 

As @lbendlin  mentioned, Lakehouse SQL Endpoints are read only, so you can't write to them. 

 

You might be able to use the Power Automate HTTP connector to use the REST API to create files in OneLake directly. 
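To sketch what that HTTP approach could look like: OneLake's DFS endpoint exposes the same REST surface as ADLS Gen2, where a file upload is a create (PUT), an append (PATCH), and a flush (PATCH). The snippet below only builds the three requests with the standard library and does not send them; the workspace, lakehouse, and file names and the bearer token are placeholders you would supply from your own tenant.

```python
import urllib.request

ONELAKE = "https://onelake.dfs.fabric.microsoft.com"

def build_upload_requests(workspace, lakehouse, file_path, data, token):
    """Build the three ADLS Gen2-style requests (create, append, flush)
    that upload a small file to OneLake. Nothing is sent here; a caller
    would execute them in order with urllib.request.urlopen. All names
    and the token are placeholders.
    """
    url = f"{ONELAKE}/{workspace}/{lakehouse}.Lakehouse/Files/{file_path}"
    headers = {"Authorization": f"Bearer {token}"}

    # 1) Create an empty file at the target path.
    create = urllib.request.Request(
        f"{url}?resource=file", method="PUT", headers=headers
    )
    # 2) Append the payload bytes at offset 0.
    append = urllib.request.Request(
        f"{url}?action=append&position=0", data=data,
        method="PATCH", headers=headers,
    )
    # 3) Flush (commit) the written bytes.
    flush = urllib.request.Request(
        f"{url}?action=flush&position={len(data)}",
        method="PATCH", headers=headers,
    )
    return create, append, flush
```

In Power Automate the same three calls would be three HTTP actions, with the token supplied by the connector's authentication settings rather than by hand.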

 

Alternatively, you can ingest the data into a Warehouse through its SQL endpoint (Warehouse SQL endpoints are writable), and then, if you need the data in a Lakehouse, have a pipeline or copy job move it from the Warehouse to the Lakehouse.
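For the Warehouse route, the connection differs from an ordinary SQL Server mainly in that only Entra ID authentication works. As a sketch, this helper builds an ODBC connection string a library such as pyodbc could consume; the endpoint and warehouse names are placeholders, and ActiveDirectoryInteractive is just one of the Entra ID authentication modes.

```python
def warehouse_connection_string(sql_endpoint, warehouse_name):
    """Build an ODBC connection string for a Fabric Warehouse SQL endpoint.

    The endpoint and warehouse names are placeholders. Entra ID auth is
    required (SQL logins are not supported); ActiveDirectoryInteractive
    is used here as one example mode.
    """
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={sql_endpoint},1433;"
        f"Database={warehouse_name};"
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;"
    )
```

A regular INSERT through that connection works against a Warehouse, which is exactly what the read-only Lakehouse SQL endpoint refuses.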

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.

I was just starting to look at the HTTP connection several websites mention; I'll go back to that. Thanks!

lbendlin
Super User

What is the SQL analytics endpoint for a lakehouse? - Microsoft Fabric | Microsoft Learn

 

SQL endpoints for Lakehouses are read-only. You need to write to the actual Lakehouse, or to a Fabric SQL Database.
