marcosbajr
Frequent Visitor

The Lakehouse does not update when I refresh the Dataflow GEN2

I'm facing an issue where the data only goes to the Lakehouse when I click the "Publish Now" button in Dataflow GEN2.

When I try to schedule a refresh or even force a refresh, the dataflow gets updated, but the Lakehouse does not. I can only update it by clicking "Publish Now."

Ideally, 7 rows should appear, but only 5 are showing up (the 5 that were loaded when I last published by clicking the button).


In the Dataflow, everything appears correctly, but even after the refresh, the Lakehouse still shows only 5 rows.


1 ACCEPTED SOLUTION

Hi @marcosbajr 

Thank you for your reply. Here are some steps you can take to troubleshoot the problem with the data not being sent to the Lakehouse:

1. Check your Dataflow settings to ensure that the Lakehouse destination is configured correctly. Confirm that it is targeting the correct table; otherwise you may see errors like the one in the screenshot below.

[screenshot of the error]

 

2. In the Dataflow settings, verify that the automatic refresh schedule is set up properly. Make sure it's configured to run at appropriate intervals and that the correct time zone is applied.

3. Run a SELECT * FROM <table_name> query in your Notebook to see what data is currently in the Lakehouse table. This will help you confirm the number of records and which specific records are stored (a short sketch of this check follows this list).

4. If the issue persists after checking the above, consider rebuilding the Dataflow from scratch. This can sometimes resolve hidden configuration issues.
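
As a rough sketch of step 3, assuming a Fabric notebook attached to the Lakehouse (where the spark session is already available) and using a placeholder table name:

# Minimal sketch; "your_table" is a placeholder for the actual Lakehouse table name.
df = spark.sql("SELECT * FROM your_table")
print(df.count())          # how many rows the Lakehouse table currently holds
df.show(truncate=False)    # the stored records themselves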

If this post is helpful, please consider marking it as a solution so that other members can find it more easily.

Thank you.




5 REPLIES
v-ssriganesh
Community Support

Hi @marcosbajr 
Thanks for reaching out to the Fabric community forum.

I have ingested sample data of 10 rows in Dataflow Gen2.


I have done some transformations and published the output to the Lakehouse.


I set up a schedule to refresh the data, and it was successfully refreshed with the 10 rows of correct data.


So, based on your issue, please follow these steps to resolve your query:
1. Ensure that all necessary ports (e.g., 1433) and endpoints are whitelisted in your network settings.

2. Verify that the schema and data types in your Dataflow Gen2 match those in the Lakehouse. If there are any mismatches, try updating the schema or creating a new table (a short sketch for checking the Lakehouse side follows this list).
3. If you're using the FastCopy feature, try disabling it to see if it resolves the issue.
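
As a rough sketch for step 2, assuming a notebook attached to the Lakehouse and using a placeholder table name (spark is the session the Fabric notebook provides by default):

# Minimal sketch; "your_table" is a placeholder for the actual Lakehouse table name.
spark.sql("DESCRIBE TABLE your_table").show(truncate=False)   # column names and data types
# Compare the columns and types printed here with the output schema of the Dataflow Gen2 query;
# a mismatch between the two is one reason a scheduled write may not land as expected.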

If this post helps, please consider accepting it as the solution so other members can find it more quickly.

Hope this helps!

Thank you.








marcosbajr
Frequent Visitor

Unfortunately, the issue persists. The data is only sent to the Lakehouse when I click "Publish Now." When I schedule a refresh, the data is updated only in the Dataflow Gen2, while the Lakehouse remains unchanged.

The fast copy feature is disabled.

When I use the DESCRIBE HISTORY command in the Notebook, it states that the table has been replaced, but the updated data does not appear.
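
For reference, the history check was roughly of this form, run in a notebook cell (the table name here is a placeholder):

# Roughly the check described above; "your_table" stands in for the actual Lakehouse table.
# Lakehouse tables are Delta tables, so DESCRIBE HISTORY lists each write as a new version.
spark.sql("DESCRIBE HISTORY your_table").show(truncate=False)
# The output shows the table being replaced, yet the updated rows do not appear.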




tharunkumarRTK
Super User

@marcosbajr 

 

Did you find any errors in the dataflow refresh history? If yes, please share them.

 

Also, reconfigure the destination settings of your query and see if that helps. 

 


marcosbajr
Frequent Visitor

The dataflow refreshes normally. I've already tried reconfiguring the destination, but the Lakehouse still doesn't update.
