We are receiving approximately 2,000 large, complex JSON events per minute from a Google Pub/Sub source and ingesting them into a Lakehouse delta table via an Eventstream, with no transformation steps. We are experiencing some very strange behaviour when the Eventstream writes to the table: only about one third of the received input is written, and it is not because of errors, since the runtime logs are empty. I'm providing a picture of the Data insights tab below, showcasing the behaviour.
I've tried ingesting the same source to an Eventhouse, and everything runs smoothly there (input = output).
The spookiest thing is that the events just disappear without any error or information.
Why is the Lakehouse not able to consume the input events properly?
Thanks in advance.
Hello @nbj-ksdh ,
Is Eventstream now working properly for you? If not, what workaround have you used in Fabric?
Thank you
Hi @nbj-ksdh ,
Can you click Refresh and adjust the time zone, then check again? In my testing, its output event count was higher than its input event count, so I think there may be some delay in transmission. Can you also open the table in a notebook and use a query to check whether the number of rows in the table matches the number of events from the data source?
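To make that check concrete, here is a minimal sketch in plain Python. The table name and the counts are hypothetical; in a Fabric notebook you would get the actual row count from Spark, as noted in the comments, and the input count from the Eventstream Data insights tab.

```python
def count_mismatch(source_events: int, table_rows: int) -> tuple:
    """Compare source event count with destination row count.

    Returns (is_missing, missing_fraction).

    In a Fabric notebook, table_rows could come from something like:
        spark.table("my_lakehouse.pubsub_events").count()  # hypothetical table name
    and source_events from the Eventstream's reported input count
    over the same time window.
    """
    if source_events <= 0:
        return (False, 0.0)
    missing = max(source_events - table_rows, 0)
    fraction = missing / source_events
    return (fraction > 0.0, fraction)


# Numbers mirroring the reported behaviour (only ~1/3 of events written):
mismatch, fraction = count_mismatch(120_000, 40_000)
print(f"missing rows: {mismatch}, fraction lost: {fraction:.2f}")
```

Make sure both counts cover the same time window, otherwise normal transmission delay will look like data loss.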
Best Regards,
Neeko Tang
If this post helps, then please consider Accepting it as the solution to help the other members find it more quickly.