mhanwm
Frequent Visitor

Lakehouse tables are empty after a table schema change and after triggering dataflow refresh

I have a gen2 dataflow that reads data from an on-prem database. I am able to see data in my dataflow. 

I selected a lakehouse as the destination with:

* update method = replace

* column mapping = auto mapping/allow schema change

Initially I was able to populate the lakehouse with data. I was also able to shortcut data from the lakehouse into a semantic model. 

 

Then I had to add a new column to a table in the lakehouse. Because the table wasn't picking up the new column, I dropped, recreated, and refreshed it. After that, all the data in my lakehouse disappeared: I can see the number of rows that should be there, but all the values are empty. I even checked the underlying files using PySpark and don't see any data at all. Since then, I have not been able to get any data into a lakehouse, which is extremely frustrating.
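For reference, the kind of check I mean looks roughly like this in a Fabric notebook attached to the lakehouse (the table path and column name below are placeholders, not the real names):

    # Read the Delta table directly from the lakehouse's Tables folder.
    # "my_table" and "some_column" are placeholder names.
    df = spark.read.format("delta").load("Tables/my_table")

    print(df.count())   # row count reported by the Delta log
    df.show(10)         # inspect whether the values themselves are null

    # Compare total rows against non-null values in one column.
    df.selectExpr("count(*) AS total_rows",
                  "count(some_column) AS non_null_values").show()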

 

I even tried creating a smaller dataflow mapped to a new lakehouse to refresh just one table, and even then I don't see any data.

 

I did consult the forum and see people reporting similar problems, but with no real solutions.

If anyone can give any advice it would be much appreciated.  

1 ACCEPTED SOLUTION

The issue was eventually resolved after I updated the view to remove the spaces from column names.

The refresh happened more consistently after that. 

Also, when I had to update the table, dropping it first with Spark SQL worked better than right-clicking to delete the table from the interface.
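For anyone else hitting this, the Spark SQL drop step is a one-liner from a Fabric notebook; a minimal sketch, assuming the notebook's default lakehouse holds the table and using a placeholder table name:

    # Drop the lakehouse table with Spark SQL instead of the right-click delete,
    # then re-run the Gen2 dataflow refresh so it recreates the table with the
    # new schema. "my_table" is a placeholder name.
    spark.sql("DROP TABLE IF EXISTS my_table")

    # Optional sanity check after the refresh: list the recreated columns and
    # confirm none of them contain spaces.
    spark.sql("DESCRIBE TABLE my_table").show(truncate=False)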


7 REPLIES
v-nmadadi-msft
Community Support

Hi @mhanwm ,
Could you please confirm if the issue has been resolved after raising a support case? If a solution has been found, it would be greatly appreciated if you could share your insights with the community. This would be helpful for other members who may encounter similar issues.

Thank you for your understanding and assistance.


mhanwm
Frequent Visitor

@v-nmadadi-msft 

Yes, I originally tried selecting dynamic schema, but it didn't seem to pick up the column changes.


I tried creating a new lakehouse and re-importing the table into it, but that didn't resolve the issue. The dataflow always shows data, but the Lakehouse explorer never does: I see the correct number of rows, but the values are always null.

 

When I used a notebook and Spark SQL, I was able to confirm that my data exists and is correct. Based on other discussions, I concluded that the issue might have to do with the SQL analytics endpoint, which I am still trying to resolve.
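In case it helps anyone else, the notebook check looked roughly like this (table and column names are placeholders); if it returns real values while the Lakehouse explorer and SQL analytics endpoint show nulls, the Delta files are fine and the endpoint metadata is the likely suspect:

    # Query the managed table by name through the Spark session.
    # "my_table" and "some_column" are placeholder names.
    spark.sql("""
        SELECT COUNT(*)           AS total_rows,
               COUNT(some_column) AS non_null_values
        FROM my_table
    """).show()

    # Peek at a few rows to confirm the values are not null.
    spark.sql("SELECT * FROM my_table LIMIT 10").show(truncate=False)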

Hi @mhanwm ,
Please consider reaching out to Microsoft Support. You can provide them with all the troubleshooting steps you've already taken, which will help them understand the issue better and provide a resolution. They might be able to identify something specific about your admin account setup or provide a solution that isn't immediately obvious. 

Below is the link to create a Microsoft Support ticket:
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn

If you find this post helpful, please mark it as an "Accept as Solution" and consider giving a KUDOS. Feel free to reach out if you need further assistance.
Thank you

v-nmadadi-msft
Community Support

Hi @mhanwm,
Thanks for reaching out to the Microsoft fabric community forum.

Please ensure you have selected dynamic schema, not fixed schema, when configuring the data destination. That option must be selected for new columns to be added to the table.

After schema changes, review the column mapping in your Dataflow to ensure new columns are correctly mapped to the Lakehouse table.

Additionally, try creating a new table in the Lakehouse as the destination to see whether that resolves the issue. This can help determine whether the problem is specific to the original table or reflects a broader schema or metadata inconsistency.

 


If you find this post helpful, please mark it as an "Accept as Solution" and consider giving a KUDOS. Feel free to reach out if you need further assistance.
Thanks and Regards


mhanwm
Frequent Visitor

Thanks @johnbasha33. I'm quite new to this, so I would appreciate any further advice on resolving this.

 

When I added a new column to the source table, I saw it in the dataflow, refreshed the dataflow, published it, and triggered a refresh. But the new column never appeared in the lakehouse table, so I eventually dropped the table and recreated it.

 

Would you recommend not using auto-mapping/schema detection? If so, how would I manually create the column in the lakehouse? I tried using PySpark but have not been successful yet.
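(For reference, a minimal Spark SQL sketch for adding a column to an existing Delta table in the lakehouse; the table name, column name, and type are placeholders, and this assumes a standard Delta table:)

    # Add a column to an existing lakehouse Delta table with Spark SQL.
    # "my_table", "new_column", and the STRING type are placeholders.
    spark.sql("ALTER TABLE my_table ADD COLUMNS (new_column STRING)")

    # Existing rows will show NULL in new_column until the next dataflow
    # refresh (or another write) populates it.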

What would the alternative to replace be? I didn't think append would be the right option. 

How do I reset the table metadata? Is that something I can do manually?

 

johnbasha33
Super User

@mhanwm 

When you manually drop and recreate tables in a lakehouse, the physical and metadata link between the Dataflow and Lakehouse table can break — especially if:

  • Auto-mapping or schema detection gets confused,

  • You used "Replace" mode after the schema has changed,

  • Table metadata wasn't properly reset after manual changes.

This causes symptoms like:

  • Table shows row count but data is empty,

  • Files may exist in the Delta folder structure but contain no usable data (one quick check is sketched below).
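One quick way to check that last symptom from a notebook is sketched below (placeholder table name); DESCRIBE DETAIL reports how many data files a Delta table actually has and where they live:

    # Inspect the Delta table's file-level details: number of parquet files,
    # total size on disk, and the table's storage location.
    # "my_table" is a placeholder name.
    spark.sql("DESCRIBE DETAIL my_table") \
         .select("numFiles", "sizeInBytes", "location") \
         .show(truncate=False)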
