I'm facing an issue where the data only goes to the Lakehouse when I click the "Publish Now" button in Dataflow GEN2.
When I try to schedule a refresh or even force a refresh, the dataflow gets updated, but the Lakehouse does not. I can only update it by clicking "Publish Now."
Seven rows should appear, but only 5 show up (the data from the last time it was published via the button).
In the Dataflow, everything appears correctly, but even after the refresh, the Lakehouse still shows only 5 rows.
Hi @marcosbajr
Thanks for reaching out to the Fabric community forum.
I ingested a sample of 10 rows into a Dataflow Gen2.
I applied some transformations and published the output to a Lakehouse destination.
I then set up a scheduled refresh, and it completed successfully with all 10 rows of correct data.
Based on your description, please follow these steps to resolve the issue:
1. Ensure that all necessary ports (e.g., 1433) and endpoints are whitelisted in your network settings.
2. Verify that the schema and data types in your Dataflow Gen2 query match those in the Lakehouse table. If there are any mismatches, try updating the schema or creating a new table (a notebook sketch for checking the Lakehouse side follows this list).
3. If you're using the FastCopy feature, try disabling it to see if it resolves the issue.
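For item 2, a quick notebook check of the Lakehouse side might look like the sketch below; the table name sales is only a placeholder for your actual destination table.
```python
# A minimal sketch, assuming a Fabric notebook attached to the Lakehouse and a
# placeholder destination table called "sales" -- substitute your real table name.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Column names and data types on the Lakehouse side, to compare against the
# column types defined in the Dataflow Gen2 query.
spark.table("sales").printSchema()
```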
If this post helps, please consider accepting it as the solution so other members can find it more quickly.
Hope this helps!
Thank you.
Unfortunately, the issue persists. The data is only sent to the Lakehouse when I click on "Publish Now." When I schedule a refresh, the data is updated only in GEN2, while the Lakehouse remains unchanged.
The fast copy feature is disabled.
When I use the DESCRIBE HISTORY command in the Notebook, it states that the table has been replaced, but the updated data does not appear.
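For reference, here is a minimal sketch of the notebook check I'm describing; the table name sales is only a placeholder for my real destination table.
```python
# A minimal sketch, assuming a Fabric notebook attached to the Lakehouse and a
# placeholder table name "sales" -- replace with the actual destination table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Each row is one Delta commit; "operation" shows whether the scheduled refresh
# actually replaced the table, and "timestamp" shows when it happened.
history = spark.sql("DESCRIBE HISTORY sales")
history.select("version", "timestamp", "operation").show(truncate=False)
```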
Hi @marcosbajr
Thank you for your reply. Here are some steps you can take to troubleshoot the problem with the data not being sent to the Lakehouse:
1. Check your Dataflow settings to ensure that the Lakehouse destination is configured correctly. Confirm that it is pointing to the correct table; otherwise you can get errors like the one in the screenshot below.
2. In the Dataflow settings, verify that the automatic refresh schedule is set up properly. Make sure it's configured to run at appropriate intervals and that the correct time zone is applied.
3. Run a SELECT * FROM <table_name> query in your Notebook to see what data is currently in the Lakehouse table (see the sketch after this list). This will help you confirm the number of records and which specific records are stored.
4. If the issue persists after checking the above, consider rebuilding the Dataflow from scratch. This can sometimes resolve hidden configuration issues.
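For step 3, a minimal notebook sketch could look like this; the table name sales is only a placeholder for your actual destination table.
```python
# A minimal sketch, assuming a Fabric notebook attached to the Lakehouse and a
# placeholder table name "sales" -- substitute your destination table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.sql("SELECT * FROM sales")
df.show()                           # inspect which specific records are stored
print("row count:", df.count())     # compare against the rows expected from the Dataflow
```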
If this post is helpful, please consider marking it as a solution so that other members can find it more easily.
Thank you.
Did you find any errors in the dataflow refresh history? If yes, please share them.
Also, reconfigure the destination settings of your query and see if that helps.
The dataflow refreshes normally. I’ve already tried reconfiguring the destination, but the Lakehouse still doesn’t update.