Hi,
Has anyone ever implemented a use case to trigger the receipt of an email and launch a Power Automate flow to move the attached Excel file to a lakehouse within Fabric?
I want Fabric to be the destination, but I want to manage that trigger with Power Automate and, from there, load the Excel file into Fabric. I can do this if I use SharePoint as a staging repository before moving the file to Fabric, but I would like to know if there is a direct way to get the Excel file from the email into Fabric.
Thanks.
Hi @amaaiia
Thank you for reaching out to the Microsoft Fabric Forum Community.
@OnurOz @Ugk161610 Thanks for the inputs.
I hope the information provided by users was helpful. If you still have questions, please don't hesitate to reach out to the community.
Hi @amaaiia ,
Yes, this scenario is possible, and a few teams I’ve worked with are doing it already — but right now there is no direct action in Power Automate that writes an email attachment straight into a Fabric Lakehouse. That connector doesn’t exist yet. Because of that, every working solution today uses a small “bridge” before the file lands in Fabric.
SharePoint works, as you already found, but if you want to avoid SharePoint, the simplest and cleanest approach we’ve used is:
Send the attachment to a Storage account first, then let Fabric take over from there.
Power Automate can natively write files into ADLS Gen2 using “Create file (Blob Storage),” and once the file is in storage, you have full control in Fabric — pipelines, Dataflows Gen2, or even Autoloader-style ingestion if you’re using shortcuts or watchers.
The nice thing is:
Power Automate → Storage → Fabric
is fast, reliable, and avoids having to maintain SharePoint folders just for staging.
At the moment, there’s just no “write to Lakehouse” connector, so going through Storage is the closest to a direct path.
If Microsoft adds a native Lakehouse connector in Power Automate, this flow will get much simpler. But for now, storage is the most practical and production-safe option.
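To illustrate the Fabric side of that bridge, here is a minimal notebook sketch, assuming the storage container is surfaced in the Lakehouse through a OneLake shortcut under Files (the folder, file, and table names are made up):

```python
# Minimal sketch of the Fabric-side pickup, assuming the ADLS Gen2 container
# is exposed in the Lakehouse via a shortcut at Files/inbox.
# Folder, file, and table names are placeholders.
import pandas as pd

excel_path = "/lakehouse/default/Files/inbox/report.xlsx"  # hypothetical path
df = pd.read_excel(excel_path)  # pandas needs openpyxl for .xlsx files

# "spark" is the session that Fabric notebooks predefine; convert the
# pandas frame and persist it as a Delta table in the Lakehouse.
spark_df = spark.createDataFrame(df)
spark_df.write.mode("overwrite").format("delta").saveAsTable("email_report")
```

The same parsing step could just as well live in a pipeline Copy activity or a Dataflow Gen2; the notebook is only one option.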
If you want, I can share a small example Power Automate flow we use for email-to-storage ingestion.
– Gopi Krishna
Hi amaaiia,
If an instant trigger is optional, you can poll the inbox frequently (say every 5 minutes) using the Power Query Exchange connector within Dataflows Gen2. It connects directly to your Exchange/Outlook inbox, filters emails by sender, subject, or folder path, expands the attachments (Excel, CSV, etc.), and loads them directly into Lakehouse tables.
If you'd like to stick with Power Automate as the trigger, there's currently no native Fabric Lakehouse connector in Power Automate, but there are several workable patterns:
Trigger: "When a new email arrives" in Power Automate
Get attachment content
Use HTTP with Microsoft Entra ID (preauthorized) premium connector
Call OneLake REST API to write file directly to Lakehouse Files area
Connection requires two URLs:
Base resource URL: https://storage.fabric.microsoft.com
Microsoft Entra ID resource URL: https://storage.fabric.microsoft.com
This writes the file to Lakehouse Files, not directly to tables. You'd still need a subsequent Fabric pipeline or notebook to parse Excel → tables.
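For context, the HTTP action is essentially making ADLS Gen2-style calls against OneLake. Here is a rough Python sketch of that sequence done outside Power Automate, using OneLake's DFS endpoint; the workspace, lakehouse, and file names are placeholders, and authentication will depend on your tenant setup:

```python
# Rough sketch of writing a file into Lakehouse Files through OneLake's
# ADLS Gen2-compatible DFS endpoint. Names below are placeholders.
import requests
from azure.identity import DefaultAzureCredential

token = DefaultAzureCredential().get_token("https://storage.azure.com/.default").token
headers = {"Authorization": f"Bearer {token}"}

base = "https://onelake.dfs.fabric.microsoft.com/MyWorkspace/MyLakehouse.Lakehouse/Files"
target = f"{base}/inbox/report.xlsx"

with open("report.xlsx", "rb") as f:
    data = f.read()

# Create the file, append the bytes, then flush to commit the upload.
requests.put(f"{target}?resource=file", headers=headers).raise_for_status()
requests.patch(f"{target}?action=append&position=0", headers=headers, data=data).raise_for_status()
requests.patch(f"{target}?action=flush&position={len(data)}", headers=headers).raise_for_status()
```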
Option 2: send to a Fabric Eventstream
- Use the Power Automate "Send event" action with the Event Hubs protocol
- Send the attachment metadata/content to a Fabric Eventstream
- The Eventstream routes the data to the Lakehouse
This is best for streaming scenarios and less ideal for file attachments.
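As a rough illustration of that path, here is a small Python sketch that pushes attachment metadata to an Eventstream through the Event Hub-compatible custom endpoint it exposes; the connection string and payload are placeholders:

```python
# Sketch of sending attachment metadata to a Fabric Eventstream through its
# Event Hub-compatible custom endpoint. Connection string and payload are
# placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

conn_str = "<Eventstream custom endpoint connection string>"  # placeholder
producer = EventHubProducerClient.from_connection_string(conn_str)

event = {"file_name": "report.xlsx", "received_utc": "2025-11-20T08:00:00Z"}
batch = producer.create_batch()
batch.add(EventData(json.dumps(event)))
producer.send_batch(batch)
producer.close()
```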
Option 3: stage in SharePoint (as you mentioned)
- Power Automate saves the attachment to SharePoint
- A Fabric Dataflow or Pipeline picks it up from SharePoint
This is the most common pattern because:
- the SharePoint connector is native and free
- it provides an audit trail and versioning
- it simplifies error handling
Btw, there is already an active idea/suggestion submitted to the Fabric team; details are here: https://community.fabric.microsoft.com/t5/Fabric-Ideas/Email-data-directly-into-a-lakehouse/idi-p/47...
I suggest you vote for this too.
Hope that helps
Onur
😊If this post helped you, feel free to give it some Kudos! 👍
✅And if it answered your question, please mark it as the accepted solution.