Hi,
I am updating the data type of a column named Year (from string to int). The change is reflected in my lakehouse table, but not in my default dataset.
I get the following error when I update the Power BI dataset:
The column type change from 'String' to 'Int64' is not allowed for Direct Lake table column 'Year'['fact_cost_monthly']. Please choose a compatible data type or exclude the column. See https://go.microsoft.com/fwlink/?linkid=2215281 to learn more.
In the default dataset in Fabric, the column type is still string.
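For context, this is the kind of change that triggers the mismatch. A minimal sketch, assuming a Fabric notebook attached to the lakehouse, where a SparkSession named `spark` is available (the table and column names come from the error message above):

```python
from pyspark.sql import functions as F

# Read the current Delta table from the lakehouse.
df = spark.table("fact_cost_monthly")

# Cast the Year column from string to int.
df = df.withColumn("Year", F.col("Year").cast("int"))

# Rewrite the table in place; overwriteSchema is required because
# string -> int is not a metadata-only change for a Delta table.
(df.write
   .mode("overwrite")
   .format("delta")
   .option("overwriteSchema", "true")
   .saveAsTable("fact_cost_monthly"))
```

This updates the lakehouse table itself, but the default dataset keeps the old 'String' type, which is what produces the error above.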
@priyanksingh This behaviour has been seen multiple times; hoping MS has this in their issues backlog!
From a data engineering perspective, Delta Lake files do support schema changes to a certain extent,
but sometimes we have to rewrite the whole table back to the Lakehouse, and that is why schema binding is so important when designing lakehouses (a sketch follows below).
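As one illustration of schema binding, a minimal sketch: declare the expected schema explicitly and fail the write when incoming data drifts, instead of letting an overwrite silently change the table. The schema contract below is hypothetical; only "Year" comes from this thread, and "Cost" is purely illustrative.

```python
from pyspark.sql import DataFrame
from pyspark.sql.types import StructType, StructField, IntegerType, DoubleType

# Hypothetical schema contract for the fact table.
EXPECTED_SCHEMA = StructType([
    StructField("Year", IntegerType(), True),
    StructField("Cost", DoubleType(), True),
])

def write_fact(df: DataFrame, table_name: str) -> None:
    # Fail fast on schema drift instead of silently rewriting the table
    # and breaking downstream consumers such as the default dataset.
    if df.schema != EXPECTED_SCHEMA:
        raise ValueError(
            f"Schema drift for {table_name}: got {df.schema.simpleString()}, "
            f"expected {EXPECTED_SCHEMA.simpleString()}"
        )
    df.write.mode("append").format("delta").saveAsTable(table_name)
```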
In a perfect world we would indeed expect the dataset to reflect the underlying schema change from the Lakehouse.
Feel free to create a support ticket on this, and please update us if you get any feedback from support that seems relevant for the rest of us as well.
Reference:
https://community.fabric.microsoft.com/t5/General-Discussion/Updating-Power-BI-Dataset-Metadata-Conn...
Thanks for the feedback. I have a similar issue, but it seems that once you are in this scenario it is impossible to back out of it.
I have deleted the offending table from the Lakehouse, however it still appears to create a conflict despite it not existing, and prevents me from refreshing the Semantic Model.
Very frustrating, and not clear if there is a path to rescue a Model in this state. I don't mind recreating tables, but don't want to have to start the whole Model/Lakehouse again.
In my experience so far in the Lakehouse, any schema changes require the Lakehouse table to be deleted and re-created in order for the change to be reflected downstream in the dataset. If you change the schema, the change is not reflected in the default dataset. The modeling interface still shows the old schema. You do not see any errors until you run a query against the dataset.
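For the delete-and-recreate path described above, a minimal sketch, again assuming a Fabric notebook with an ambient `spark` session and the names from this thread (the "_staged" table name is hypothetical):

```python
from pyspark.sql import functions as F

# Stage the corrected data under a temporary name first, so the data
# survives dropping the original managed table.
(spark.table("fact_cost_monthly")
      .withColumn("Year", F.col("Year").cast("int"))
      .write.format("delta")
      .saveAsTable("fact_cost_monthly_staged"))

# Drop the original and rename the staged table into place, so the
# default dataset sees a freshly created table with the new schema.
spark.sql("DROP TABLE IF EXISTS fact_cost_monthly")
spark.sql("ALTER TABLE fact_cost_monthly_staged RENAME TO fact_cost_monthly")
```

Staging first avoids relying on a cached DataFrame whose plan still points at the dropped table's files.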
We don't need to delete and recreate the entire table in the lakehouse; you can just remove the table from the default dataset and add it back.
But this should be fixed, as my table contains millions of records. We cannot drop and recreate the table every time there is a small change or update.