Hi Team, We are trying to load data from a flat file into a Fabric lakehouse using Dataflow Gen2 with an on-premises data gateway. Only around 54 lakh (about 5.4 million) records arrive in the destination table, and the dataflow refresh keeps clocking/spinning most of the time. When we check the destination, it does not show the entire data. Could you please help us fix this issue, or let us know if there is a workaround?
Thank you for your response. Let me go through the document you shared with me.
Following up on the last response to check whether you have a resolution yet.
If you have found a resolution, please share it with the community, as it can be helpful to others.
Otherwise, please respond with more details and we will try to help.
Thanks.
We haven’t heard from you since the last response and wanted to check whether you have a resolution yet. If you have found a resolution, please share it with the community, as it can be helpful to others.
If you have any questions relating to the current thread, please let us know and we will try our best to help you.
If you have a question about a different issue, we request that you open a new thread.
Thanks.
Thanks for using the Microsoft Fabric Community.
Apologies for the issue you are facing.
As I understand it, you are trying to load data from a flat file into a Fabric lakehouse using Dataflow Gen2 with an on-premises data gateway, but the dataflow refresh keeps clocking/spinning most of the time and the destination does not show the entire data.
When using Microsoft Fabric Dataflow Gen2 with an on-premises data gateway, you might encounter issues with the dataflow refresh process. The underlying problem occurs when the gateway is unable to connect to the dataflow staging Lakehouse in order to read the data before copying it to the desired data destination. This issue can occur regardless of the type of data destination being used.
You can refer to the document for more details.
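In the meantime, a quick way to verify how many rows have actually landed in the destination is to count them from a Fabric notebook attached to the lakehouse. A minimal sketch follows; "my_destination_table" is a placeholder for your actual destination table name, and spark is the Spark session that Fabric notebooks provide by default:

    # Run in a Fabric notebook attached to the destination lakehouse.
    # Replace "my_destination_table" with your actual table name.
    df = spark.read.table("my_destination_table")

    # Compare this count against the expected row count in the flat file
    # to confirm whether the load stopped partway through.
    print(f"Rows loaded so far: {df.count():,}")

If the count stays stuck below the expected total across refreshes, that is consistent with the gateway failing to read from the staging Lakehouse, as described above.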
I hope this information helps. Please do let us know if you have any further questions.
Thanks.