I need to ingest data from a Kafka stream into my Fabric lakehouse.
What are my options in Fabric (Eventstream, Event Hubs, KQL, Data Factory)?
Can anyone share some experience?
Hi @joakimfenno ,
First, create an eventstream. Then select "Enhanced Capabilities (preview)" and click "external source" to choose a source type (for Kafka, that would be the Apache Kafka or Confluent Cloud Kafka connector). Next, add a Lakehouse as the destination. Finally, click "publish."
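Whichever route you pick, the Kafka messages typically arrive as JSON and need to be flattened into table rows before they land in the lakehouse. A minimal stdlib sketch of that mapping step (the event schema and field names here are invented for illustration, not a real Fabric API):

```python
import json

def flatten_event(raw: bytes) -> dict:
    """Flatten a hypothetical Kafka JSON event into a flat row
    suitable for appending to a lakehouse table."""
    event = json.loads(raw)
    return {
        "event_id": event["id"],
        "event_time": event["timestamp"],
        "device": event["payload"]["device"],
        "temperature": event["payload"]["temperature"],
    }

# A message as it might arrive from a Kafka topic (hypothetical schema)
msg = b'{"id": "e-1", "timestamp": "2024-05-01T12:00:00Z", ' \
      b'"payload": {"device": "sensor-7", "temperature": 21.5}}'
row = flatten_event(msg)
print(row["device"])  # sensor-7
```

If you go the Eventstream route, this kind of reshaping can instead be done with the built-in event processing operators before the Lakehouse destination; the code above is only to show what the transformation amounts to.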
Hi @joakimfenno ,
You can do this by creating an eventstream and then setting the destination to a lakehouse.
Here is a blog post I'd recommend: Building a simple data lake / DW solution in MS Fabric (Part 4) – Brian's Tech Blog (wordpress.com)
Best Regards,
Neeko Tang
If this post helps, please consider accepting it as the solution so other members can find it more quickly.