priyanksingh
New Member

Default dataset of the lakehouse is not getting updated

Hi,

I am updating the datatype of the column named Year from string to int (roughly the cast sketched at the end of this post). The change is reflected in my Lakehouse table but not in my default dataset.

 

I am getting the following error when I update the Power BI dataset:

 

Unable to update BI model with these changes. Please try again later or contact support.

The column type change from 'String' to 'Int64' is not allowed for Direct Lake table column 'Year'['fact_cost_monthly']. Please choose a compatible data type or exclude the column. See https://go.microsoft.com/fwlink/?linkid=2215281 to learn more.

Please try again later or contact support. If you contact support, please provide these details.
 
 
Datatype in my Lakehouse:

[screenshot: priyanksingh_0-1693472057648.png]

 

 

In the default dataset in Fabric (it is still string):

[screenshot: priyanksingh_1-1693472162467.png]
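A minimal sketch of the kind of change being applied, assuming a Fabric Spark notebook where a `spark` session is predefined (table and column names are taken from the error message above):

```python
from pyspark.sql import functions as F

# Read the existing Lakehouse table.
df = spark.read.table("fact_cost_monthly")

# Cast the Year column from string to int.
df_casted = df.withColumn("Year", F.col("Year").cast("int"))

# A type change is not an in-place Delta operation: the table is
# rewritten, with overwriteSchema so Delta accepts the new column type.
(df_casted.write
    .format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("fact_cost_monthly"))
```

This rewrite succeeds on the Lakehouse side, which matches the screenshots: the Delta table shows int while the default dataset still shows string.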

 

6 REPLIES
puneetvijwani
Resolver IV

@priyanksingh This behaviour has been seen multiple times; hoping MS has this in their issues backlog!

From a data engineering perspective, Delta Lake files do support schema changes to a certain extent,
but sometimes we have to rewrite the whole table back to the Lakehouse, which is why schema binding is so important when designing lakehouses (a quick way to verify what Delta recorded is sketched below).

In a perfect world we would indeed expect the dataset to reflect the underlying schema change from the Lakehouse.

Feel free to create a support ticket on this, and also update us if you get any feedback from support that seems relevant for the rest of us.
 
Reference:

https://community.fabric.microsoft.com/t5/General-Discussion/Updating-Power-BI-Dataset-Metadata-Conn...
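As a quick check of "to a certain extent": the Delta log records both the current schema and the operation that produced it, so you can confirm whether the type change was actually written as a full overwrite. A short sketch, assuming Spark SQL in a Fabric notebook:

```python
# Show the table's current column types as Delta sees them.
spark.sql("DESCRIBE TABLE fact_cost_monthly").show()

# Show the operations (e.g. a full overwrite) that produced the
# current version of the table.
spark.sql("DESCRIBE HISTORY fact_cost_monthly").show(truncate=False)
```

If the history shows the overwrite but the default dataset still reports the old type, the problem is on the dataset-metadata side, not in Delta.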

Thanks for the feedback. I have a similar issue, but it seems that once you are in this scenario it is impossible to back out of it.

I have deleted the offending table from the Lakehouse; however, it still appears to create a conflict despite not existing, and prevents me from refreshing the semantic model.

 

Very frustrating, and it is not clear whether there is a path to rescue a model in this state. I don't mind recreating tables, but I don't want to have to start the whole model/Lakehouse again.

Anonymous
Not applicable

In my experience so far in the Lakehouse, any schema change requires the Lakehouse table to be deleted and re-created for the change to be reflected downstream in the dataset (a sketch of that workaround follows below). If you change the schema, the change is not reflected in the default dataset; the modeling interface still shows the old schema, and you do not see any errors until you run a query against the dataset.
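A sketch of that drop-and-recreate workaround, assuming PySpark in a Fabric notebook; the staging table name fact_cost_monthly_v2 is illustrative:

```python
from pyspark.sql import functions as F

# Stage the corrected data under a temporary name first, so nothing
# is lost if the job fails partway through.
df = spark.read.table("fact_cost_monthly")
(df.withColumn("Year", F.col("Year").cast("int"))
   .write.format("delta")
   .mode("overwrite")
   .saveAsTable("fact_cost_monthly_v2"))

# Drop the old table and move the staged one into place, so the
# default dataset picks the table up as brand new.
spark.sql("DROP TABLE IF EXISTS fact_cost_monthly")
spark.sql("ALTER TABLE fact_cost_monthly_v2 RENAME TO fact_cost_monthly")
```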

We don't need to delete and re-create the entire table in the Lakehouse. You can just remove the table from the default dataset and add it back.

@priyanksingh how are you removing the table from the default dataset?

But that should be fixed, as my table contains millions of records. We cannot drop and re-create the table for every small change or update.
