We are receiving approximately 2,000 large, complex JSON events per minute from a Google PubSub source and ingesting them into a Lakehouse delta table via an Eventstream, with no transformation steps. We are experiencing some very strange behaviour when the Eventstream writes to the table: only about a third of the received input gets written, and it's not because of errors; the runtime logs are empty. I'm providing a picture of the Data insights tab below, showcasing the behaviour.
I've tried ingesting the same source into an Eventhouse, and everything runs smoothly there (input = output).
The spookiest thing is that the events just disappear without any error or information.
Why is the Lakehouse not able to consume the input events properly?
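For anyone wanting to reproduce the check, the shortfall can be quantified directly from a Fabric notebook by counting rows per minute in the destination table. Here is a minimal PySpark sketch; the table and timestamp column names are placeholders, not our actual schema:

```python
# Count rows landed per minute over the last hour, to compare against the
# ~2000 events/minute arriving from PubSub. Names below are placeholders.
from pyspark.sql import functions as F

df = spark.read.table("pubsub_events")  # hypothetical Lakehouse table name

(df.filter(F.col("ingest_time") >= F.current_timestamp() - F.expr("INTERVAL 1 HOUR"))
   .groupBy(F.date_trunc("minute", F.col("ingest_time")).alias("minute"))
   .count()
   .orderBy("minute")
   .show(60, truncate=False))
```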
Thanks in advance.
Hello @nbj-ksdh ,
Is the Eventstream now working properly for you? If not, what workaround have you used in Fabric?
Thank you
Hi @nbj-ksdh ,
Can you click Refresh and adjust the time zone, then check again? In my testing, the output event count was actually higher than the input count, so I think there may be some delay in transmission. Can you also open the table in a notebook and run a query to check whether the number of rows in the table matches the number of events coming from the data source?
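For example, something along these lines in a notebook; the table name is a placeholder, and DESCRIBE HISTORY is standard Delta Lake SQL for inspecting commits:

```python
# "pubsub_events" is a placeholder; use your actual Lakehouse table name.

# Total rows in the destination table, to compare against the source's event count.
spark.sql("SELECT COUNT(*) AS row_count FROM pubsub_events").show()

# Delta commit history: shows when each write landed and how many rows it added
# (operationMetrics), which helps distinguish delayed commits from missing events.
spark.sql("DESCRIBE HISTORY pubsub_events") \
    .select("version", "timestamp", "operation", "operationMetrics") \
    .show(10, truncate=False)
```

If the notebook count matches the source, the gap is likely a display or latency artifact in the Data insights tab rather than real data loss.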
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.