We are receiving approximately 2,000 large, complex JSON events per minute from a Google Pub/Sub source and ingesting them into a Lakehouse delta table via an Eventstream, with no transformation steps. We are seeing some very strange behaviour when the Eventstream writes to the table: only about a third of the input events are actually written, and it's not because of errors, since the runtime logs are empty. I'm attaching a picture of the Data Insights tab below that shows the behaviour.
I've tried ingesting the same source to an Eventhouse, and everything runs smoothly there (input = output).
The spookiest thing is that the events just disappear without any error or other information.
Why is the Lakehouse not able to consume the input events properly?
Thanks in advance.
Hello @nbj-ksdh ,
Is Eventstream now working properly for you? If not, what solution did you end up using in Fabric?
Thank you
Hi @nbj-ksdh ,
Can you click Refresh and adjust the time zone, then check again? In my testing, the output event count was actually higher than the input count, so there may be some delay in transmission. Can you also open the table in a notebook and run a query to check whether the number of rows in the table matches the number of events sent from the data source?
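For example, here is a minimal sketch of that check in a Fabric notebook, assuming the Eventstream destination table is named events (a hypothetical name, replace it with yours) and that each ingested JSON event lands as one row. The per-minute breakdown assumes the table has the EventProcessedUtcTime timestamp column that Eventstream typically adds for Lakehouse destinations; if your table uses a different timestamp column, substitute it:

```python
from pyspark.sql import functions as F

# Hypothetical table name -- replace with your Eventstream destination table.
table_name = "events"

# 'spark' is the session predefined in a Fabric notebook.
df = spark.read.table(table_name)

# Total rows that actually landed in the delta table.
print(f"Total rows in {table_name}: {df.count()}")

# Count rows per minute to compare against the ~2,000 events/minute
# arriving from Pub/Sub. Assumes an 'EventProcessedUtcTime' column exists.
per_minute = (
    df.groupBy(F.window("EventProcessedUtcTime", "1 minute"))
      .count()
      .orderBy("window")
)
per_minute.show(20, truncate=False)
```

If the per-minute counts sit consistently around a third of the source's publish rate, that would confirm the loss happens before the table write rather than being a display artifact in the Data Insights tab.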
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.