We are receiving approximately 2,000 large, complex JSON events per minute from a Google Pub/Sub source and ingesting them into a Lakehouse delta table via an Eventstream, with no transformation steps. We are experiencing some very strange behaviour when the Eventstream writes to the table: only about one third of the received input is written to the table, and it's not because of errors - the runtime logs are empty. I'm providing a picture of the Data insights tab below, showcasing the behaviour.
I've tried ingesting the same source to an Eventhouse, and everything runs smoothly there (input = output).
The spookiest thing is that the events just disappear without any error or other information.
Why is the Lakehouse not able to consume the input events properly?
Thanks in advance.
Hello @nbj-ksdh ,
Is Eventstream working properly for you? If not, what solution have you used with Fabric?
Thank you
Hi @nbj-ksdh ,
Can you click Refresh and adjust the time zone, then check again? In my testing, the output events were higher than the input events, so I think there may be some delay in transmission. Can you also open the table in a notebook and run a query to check whether the number of rows in the table matches the number of events from the data source?
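The row-count check above could look something like this in a Fabric notebook. This is a minimal sketch, not a definitive diagnosis: the table name `events_raw` is an assumption (use your actual Eventstream destination table), and the Spark lines are shown as comments to uncomment in a notebook with an attached Lakehouse.

```python
# Sketch of the row-count verification suggested above.
# Assumption: the Eventstream writes to a Lakehouse table named "events_raw";
# adjust the name to match your destination.

def loss_ratio(input_events: int, table_rows: int) -> float:
    """Fraction of input events that never reached the table."""
    if input_events == 0:
        return 0.0
    return 1.0 - (table_rows / input_events)

# In a Fabric notebook with an attached Lakehouse, the actual count
# would come from Spark (uncomment there):
# table_rows = spark.sql("SELECT COUNT(*) FROM events_raw").first()[0]
# spark.sql("DESCRIBE HISTORY events_raw").show()  # inspect the table's write commits

# Illustration with the rough numbers from the post: ~2000 events/min in,
# only about a third apparently landing in the table.
print(f"approx. loss: {loss_ratio(2000, 666):.0%}")
```

Comparing this count against the event count reported by the Pub/Sub source (or the Eventstream input metric) over the same window would show whether the discrepancy is real data loss or just a reporting/latency artifact in the Data insights tab. `DESCRIBE HISTORY` on the delta table also reveals how many write commits actually happened and how many rows each added.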
Best Regards,
Neeko Tang
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.