I need to ingest data from a Kafka stream into my Fabric lakehouse.
What are my options in Fabric (eventstream, Event Hubs, KQL, Data Factory)?
Can anyone share some experience?
Hi @joakimfenno ,
First, create an eventstream. Then select "Enhanced Capabilities (preview)" and click "External source" to choose your source type (your Kafka stream, in this case). Next, add a Lakehouse as the destination. Finally, click "Publish."
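If you prefer a code-first route instead of the Eventstream UI, a Fabric Spark notebook can also consume the Kafka topic directly with Structured Streaming and append it to a Delta table in the attached lakehouse. A minimal sketch, assuming the notebook's built-in spark session and that the Kafka connector is available on the Spark runtime; the broker, topic, checkpoint path, and table name are placeholders, not values from this thread:

```python
from pyspark.sql.functions import col

# Read the topic as a streaming DataFrame (broker/topic are placeholders).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "my-broker:9092")
    .option("subscribe", "my-topic")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key and value as binary, so cast them before landing.
events = raw.select(
    col("key").cast("string").alias("key"),
    col("value").cast("string").alias("value"),
    col("timestamp"),
)

# Append to a managed Delta table in the lakehouse attached to the notebook.
query = (
    events.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "Files/checkpoints/kafka_events")
    .toTable("kafka_events")
)
```

The eventstream route is the no-code path; the notebook route gives you more control over parsing and transformation before the data lands.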
Hi @joakimfenno ,
You can do this by creating an eventstream and then setting its destination to a lakehouse.
I found a blog post worth referring to: Building a simple data lake / DW solution in MS Fabric (Part 4) – Brian's Tech Blog (wordpress.com)
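Once the eventstream is publishing, you can sanity-check the landed rows from a notebook attached to the same lakehouse. A quick sketch, where kafka_events is a placeholder for whatever table name you chose as the lakehouse destination:

```python
# Read back the destination table and peek at recent rows.
df = spark.read.table("kafka_events")
print(df.count())
df.show(10, truncate=False)
```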
Best Regards,
Neeko Tang
If this post helps, please consider accepting it as the solution to help other members find it more quickly.