Hi, I have scheduled an Invoke Pipeline (preview) that gets data from an on-prem SQL Server into a Warehouse via a Lakehouse and builds further transformations/analytics on the warehouse table.
I have observed multiple times that after the Invoke Pipeline completes successfully, the warehouse/destination table does not reflect the new data: new records aren't inserted/updated, and the counts between the on-prem table and the warehouse table don't match when they should. Yet when I manually run the pipeline that the Invoke Pipeline triggers, the new data comes in. Is this expected behaviour from Fabric, or is something wrong on my side?
Also, today I observed that the lakehouse table used as staging didn't receive even a single record. We handle source deletes in this pipeline, and the whole table got deleted (if that is what happened, it will be hard to recover the data we loaded over the past 30-40 days). But when I checked the JSON output in the pipeline run history, it shows the dataWritten/dataRead counts, yet the data is not reflected.
Thanks for your response/help in advance.
Hi @ESharathChandra ,
We wanted to check whether the issue has been resolved with the help of our support team. If it has, we kindly ask you to share the solution here and mark it as the accepted answer so that other users can benefit as well.
If you still have any questions or need further assistance, please don’t hesitate to let us know. We’re more than happy to continue supporting you.
Thank you for your patience. We look forward to hearing from you.
Best regards,
Chaithra E.
Hi @ESharathChandra ,
As we haven't heard back from you, we're just following up on our previous message. I'd like to confirm whether you've successfully resolved this issue or still need further help.
If yes, you are welcome to share your workaround and mark it as a solution so that other users can benefit as well. If you found a particular reply helpful, you can also mark that as the solution.
If you still have any questions or need more support, please feel free to let us know. We are more than happy to continue to help you.
Thank you for your patience; we look forward to hearing from you.
Best Regards,
Chaithra E.
Hi @v-echaithra , thanks for your response. We have raised the issue with Microsoft Support and are yet to hear back, but we haven't encountered the problem again.
Once we hear back from MS Support, we will share the solution here.
Thanks & Regards,
Eravelli Sharath Chandra
Hi @ESharathChandra ,
May I ask if you have gotten this issue resolved by support?
If it is solved, please mark the helpful reply or share your solution and accept it as the solution; that will help other community members with similar problems solve them faster.
Regards,
Chaithra.
Hi @ESharathChandra ,
Thanks for your response. If your issue still persists, please consider raising a support ticket for further assistance.
To raise a support ticket for Fabric and Power BI, kindly follow the steps outlined in the following guide:
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn
Best Regards,
Chaithra.
Hi @ESharathChandra ,
Thank you for reaching out to Microsoft Community.
If you suspect Lakehouse table deletion, check whether any activity in the pipeline contains delete logic.
Older versions of the on-premises data gateway can cause issues when Dataflow Gen2 loads data into a lakehouse: when the lakehouse table has column names with spaces, the load can produce a table consisting entirely of null values.
You can solve this issue by upgrading your gateway to version 3000.266.4 or later. Also try a simple copy of a single record and trace it through the same pipeline path.
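To check quickly whether a staging table came through as all nulls, here is a minimal PySpark sketch for a Fabric notebook; staging_table is a placeholder for your actual table name:

```python
# Sketch: detect the "all null values" symptom after a load.
# Runs in a Fabric notebook, where a `spark` session is provided.
from functools import reduce
from pyspark.sql import functions as F

df = spark.read.table("staging_table")  # placeholder table name

total = df.count()
# Count rows where every single column is null.
all_null = df.filter(
    reduce(lambda a, b: a & b, [F.col(c).isNull() for c in df.columns])
).count()

print(f"total rows: {total}, all-null rows: {all_null}")
```

If all_null is close to total right after a load, the gateway/column-name issue above is a likely culprit.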
If our response addressed your issue, please mark it as the accepted solution and click Yes if you found it helpful.
Regards,
Chaithra.
Thank you @v-echaithra for the response.
In our case we are using a Copy data activity as part of the pipeline to copy data from on-prem to the Lakehouse. The lakehouse table is overwritten daily.
Recently we observed that no data came into the lakehouse via the Copy data activity, even though on-prem has data and the pipeline's output JSON says rows were read and copied (screenshot of the pipeline's JSON output below).
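One way to catch this at run time (a minimal sketch, assuming a notebook activity can be added after the Copy activity): pass the Copy activity's reported row count into the notebook as a base parameter, e.g. wired to @activity('Copy data').output.rowsCopied, and fail the run if the lakehouse count doesn't match. The names below are placeholders.

```python
# Sketch: validate that the lakehouse actually received the rows the
# Copy activity reported. `expected_rows` is a notebook base parameter,
# set from the pipeline expression @activity('Copy data').output.rowsCopied.
expected_rows = 0  # overridden by the pipeline's base parameter

actual_rows = spark.read.table("staging_table").count()  # placeholder table name
if actual_rows != expected_rows:
    # Raising here fails the notebook activity (and hence the pipeline run),
    # so the mismatch is caught at load time instead of downstream.
    raise ValueError(
        f"Copy activity reported {expected_rows} rows, "
        f"but the lakehouse table has {actual_rows}"
    )
print(f"Row counts match: {actual_rows}")
```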
I am also facing the same issue with the delay.
Hi @ESharathChandra , this could be because of SQL endpoint metadata sync delays (a known issue).
If you manually refresh the metadata sync, does it reflect the new data?
If yes, you could try triggering the SQL endpoint refresh programmatically, as mentioned in Refresh SQL analytics endpoint Metadata REST API (Preview) | Microsoft Fabric Blog | Microsoft Fabri...
One way is to invoke the script at https://github.com/microsoft/fabric-toolbox/blob/main/samples/notebook-refresh-tables-in-sql-endpoin... as a notebook activity after your load job finishes.
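For reference, here is a minimal sketch of what such a call might look like from a Fabric notebook. The workspace and SQL endpoint IDs are placeholders, and the exact route, request body, and token audience should be verified against the blog post above; this is an illustration, not the toolbox script itself.

```python
# Sketch: trigger the (preview) SQL analytics endpoint metadata refresh.
# Assumes it runs inside a Fabric notebook, where mssparkutils can issue a token.
import requests
from notebookutils import mssparkutils

workspace_id = "<workspace-id>"        # placeholder: your workspace ID
sql_endpoint_id = "<sql-endpoint-id>"  # placeholder: your SQL analytics endpoint ID

token = mssparkutils.credentials.getToken("pbi")  # token for the Fabric REST API
url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
    f"/sqlEndpoints/{sql_endpoint_id}/refreshMetadata?preview=true"
)

resp = requests.post(url, headers={"Authorization": f"Bearer {token}"}, json={})
print(resp.status_code, resp.text)  # a 2xx status means the refresh was accepted
```

Running something like this as a notebook activity right after the load step forces the endpoint to pick up new Delta data before the warehouse transformations read it.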
Thanks for the response @Gpop13 . No, the new data didn't even get into the lakehouse, and the lakehouse table was empty today.
The metadata sync issue is a known one; I've encountered it multiple times when getting data from on-prem SQL Server into the lakehouse with a Copy data activity and then loading into a warehouse table. In that case the data reaches the lakehouse, but due to the sync issue the new data doesn't make it into the warehouse.
Today, however, the Copy data activity that gets data from on-prem to the lakehouse ran, and its JSON output showed rows copied, but after the pipeline completed the lakehouse table itself was empty.
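One way to investigate the empty table (a minimal sketch; staging_table is a placeholder) is to inspect the Delta table history, which records each overwrite and its row metrics, and which also allows reading back earlier versions via time travel:

```python
# Sketch: inspect the Delta history of the staging table to see when it was
# overwritten and how many rows each write produced.
history = spark.sql("DESCRIBE HISTORY staging_table")  # placeholder table name
history.select("version", "timestamp", "operation", "operationMetrics").show(truncate=False)

# If a recent overwrite shows numOutputRows = 0, the empty table came from the
# write itself. Earlier versions may still be recoverable via time travel:
# spark.read.format("delta").option("versionAsOf", <version>).table("staging_table")
```

If an old version is still retained (i.e. not yet vacuumed), that time-travel read is also a path to recovering the 30-40 days of data mentioned earlier in the thread.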