Hi,
Has anyone ever implemented a use case to trigger the receipt of an email and launch a Power Automate flow to move the attached Excel file to a lakehouse within Fabric?
I want Fabric to be the destination, but I want to manage that trigger with Power Automate and, from there, dump the Excel file into Fabric. I can do this if I use SharePoint as a staging repository before moving the file to Fabric, but I'd like to know whether there is a direct way to push the Excel file from the email into Fabric.
Thanks.
Solved! Go to Solution.
Hi amaaiia,
If an instant trigger isn't required, you can poll the inbox frequently (say, every 5 minutes) using the Power Query Exchange connector within Dataflows Gen2. It connects directly to your Exchange/Outlook inbox, filters emails by sender, subject, or folder path, expands attachments (Excel, CSV, etc.), and loads them directly into Lakehouse tables.
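The filtering logic that query performs is straightforward. As a rough illustration only (not the actual connector, whose steps are built in Power Query), here is the same filter-by-sender-and-subject, keep-Excel-attachments logic in Python; all field names are assumptions for the sketch:

```python
# Illustrative sketch of the filtering a Dataflows Gen2 Exchange query does:
# keep messages from a given sender whose subject matches, then collect
# their Excel attachments. Message fields here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Message:
    sender: str
    subject: str
    attachments: list = field(default_factory=list)  # attachment filenames

def excel_attachments(messages, sender, subject_contains):
    """Return Excel attachment names from messages matching sender/subject."""
    out = []
    for m in messages:
        if (m.sender.lower() == sender.lower()
                and subject_contains.lower() in m.subject.lower()):
            out.extend(a for a in m.attachments
                       if a.lower().endswith((".xlsx", ".xls")))
    return out

inbox = [
    Message("reports@contoso.com", "Daily sales report", ["sales.xlsx", "notes.txt"]),
    Message("noreply@other.com", "Daily sales report", ["spam.xlsx"]),
]
print(excel_attachments(inbox, "reports@contoso.com", "sales report"))
# ['sales.xlsx']
```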
If you'd like to stick with Power Automate as the trigger, there's currently no native Fabric Lakehouse connector in Power Automate. However, there are workable patterns:
Option 1: Write directly to OneLake via the REST API
- Trigger: "When a new email arrives" in Power Automate
- Get the attachment content
- Use the HTTP with Microsoft Entra ID (preauthorized) premium connector
- Call the OneLake REST API to write the file directly to the Lakehouse Files area
- The connection requires two URLs:
  - Base resource URL: https://storage.fabric.microsoft.com
  - Microsoft Entra ID resource URL: https://storage.fabric.microsoft.com
This writes the file to Lakehouse Files, not directly to tables. You'd still need a subsequent Fabric pipeline or notebook to parse the Excel into tables.
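For orientation, OneLake exposes an ADLS Gen2-compatible endpoint, so "write a file" is the usual DFS three-step (create, append, flush). The sketch below only builds the request URLs the flow's HTTP action would call; workspace and lakehouse names are placeholders, and the bearer token would come from the Entra ID connection:

```python
# Sketch of the ADLS Gen2-style requests behind writing to Lakehouse Files.
# "SalesWS" / "BronzeLH" are hypothetical names for illustration.
ONELAKE = "https://onelake.dfs.fabric.microsoft.com"

def onelake_file_url(workspace, lakehouse, path):
    """URL of a file under the lakehouse Files area."""
    return f"{ONELAKE}/{workspace}/{lakehouse}.Lakehouse/Files/{path}"

def write_requests(workspace, lakehouse, path, size):
    """The three DFS-API requests that create and commit a file."""
    url = onelake_file_url(workspace, lakehouse, path)
    return [
        ("PUT",   f"{url}?resource=file"),                 # create empty file
        ("PATCH", f"{url}?action=append&position=0"),      # upload the bytes
        ("PATCH", f"{url}?action=flush&position={size}"),  # commit the upload
    ]

for method, url in write_requests("SalesWS", "BronzeLH", "inbox/report.xlsx", 1024):
    print(method, url)
```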
Option 2: Send to a Fabric Eventstream
- Use the Power Automate "Send event" action with the Event Hub protocol
- Send the attachment metadata/content to a Fabric Eventstream
- The Eventstream routes the data to the Lakehouse
- Best for streaming scenarios; less ideal for file attachments
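One reason this pattern fits metadata better than files: binary content has to be base64-encoded into the event body, which inflates it by roughly a third. A rough sketch of shaping such an event (the field names are assumptions, not a documented Eventstream schema):

```python
# Illustrative only: packing an email attachment into a JSON event body for
# a "Send event" action. Field names are hypothetical.
import base64
import json

def attachment_event(filename, content: bytes, sender):
    """Serialize attachment metadata + base64 content as a JSON event."""
    return json.dumps({
        "source": "email-ingest",
        "sender": sender,
        "filename": filename,
        "contentBase64": base64.b64encode(content).decode("ascii"),
    })

evt = attachment_event("report.xlsx", b"fake-bytes", "reports@contoso.com")
print(json.loads(evt)["filename"])
# report.xlsx
```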
Option 3: Stage in SharePoint (as you mentioned)
- Power Automate saves the attachment to SharePoint
- A Fabric Dataflow/Pipeline picks it up from SharePoint
This is the most common pattern because:
- The SharePoint connector is native and free
- It provides an audit trail and versioning
- It simplifies error handling
By the way, there is already an active idea/suggestion submitted to the Fabric team; details are here: https://community.fabric.microsoft.com/t5/Fabric-Ideas/Email-data-directly-into-a-lakehouse/idi-p/47...
I suggest you vote for this too.
Hope that helps
Onur
😊If this post helped you, feel free to give it some Kudos! 👍
✅And if it answered your question, please mark it as the accepted solution.
Hi @amaaiia ,
Yes, this scenario is possible, and a few teams I’ve worked with are doing it already — but right now there is no direct action in Power Automate that writes an email attachment straight into a Fabric Lakehouse. That connector doesn’t exist yet. Because of that, every working solution today uses a small “bridge” before the file lands in Fabric.
SharePoint works, as you already found, but if you want to avoid SharePoint, the simplest and cleanest approach we’ve used is:
Send the attachment to a Storage account first, then let Fabric take over from there.
Power Automate can natively write files into ADLS Gen2 using “Create file (Blob Storage),” and once the file is in storage, you have full control in Fabric — pipelines, Dataflows Gen2, or even Autoloader-style ingestion if you’re using shortcuts or watchers.
The nice thing is:
Power Automate → Storage → Fabric
is fast, reliable, and avoids having to maintain SharePoint folders just for staging.
At the moment, there’s just no “write to Lakehouse” connector, so going through Storage is the closest to a direct path.
If Microsoft adds a native Lakehouse connector in Power Automate, this flow will get much simpler. But for now, storage is the most practical and production-safe option.
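One small practical detail in the Power Automate → Storage → Fabric pattern is blob naming: date-partitioned, collision-free paths let the Fabric pipeline pick up "today's" files with a simple wildcard and prevent two emails with identically named attachments from overwriting each other. The layout below is just a convention we find convenient, not anything Fabric requires:

```python
# Hypothetical staging-path convention for the storage bridge:
# incoming/<yyyy>/<MM>/<dd>/<HHmmss>_<filename>
from datetime import datetime, timezone

def staging_path(filename, received: datetime):
    """Build a date-partitioned, timestamp-prefixed blob path."""
    return (f"incoming/{received:%Y/%m/%d}/"
            f"{received:%H%M%S}_{filename}")

print(staging_path("report.xlsx",
                   datetime(2026, 2, 14, 10, 30, 0, tzinfo=timezone.utc)))
# incoming/2026/02/14/103000_report.xlsx
```

In the flow itself, the equivalent is built with expressions like `formatDateTime(utcNow(), 'yyyy/MM/dd')` in the blob path of the "Create file (Blob Storage)" action.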
If you want, I can share a small example Power Automate flow we use for email-to-storage ingestion.
– Gopi Krishna
Hi @amaaiia
Thank you for reaching out to the Microsoft Fabric Forum Community.
@OnurOz @Ugk161610 Thanks for the inputs.
I hope the information provided by users was helpful. If you still have questions, please don't hesitate to reach out to the community.
Hi @amaaiia
Hope everything’s going smoothly on your end. I wanted to check whether the issue got sorted out. If you have any other issues, please reach out to the community.