Hello.
I created a lakehouse in a Fabric workspace, loaded datasets into it, built a semantic model on top, and visualized the data in Power BI. Using a Spark notebook, I then added a new column to the Delta-format Transaction table in the Lakehouse and renamed an existing column (isFirstTrx -> FirstTrx). There was no problem before making this change, but afterwards I started receiving the error below, and the newly added fields were not transferred to my existing semantic model.
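For reference, the change was along these lines (a minimal sketch; my actual notebook code differed slightly, and NewField stands in for the column I actually added). Note that renaming a Delta column requires column mapping to be enabled on the table first:

    # Enable column mapping so that RENAME COLUMN is allowed (Delta requirement)
    spark.sql("""
        ALTER TABLE Transaction SET TBLPROPERTIES (
            'delta.columnMapping.mode' = 'name',
            'delta.minReaderVersion' = '2',
            'delta.minWriterVersion' = '5'
        )
    """)

    # Rename the existing column and add the new one
    spark.sql("ALTER TABLE Transaction RENAME COLUMN isFirstTrx TO FirstTrx")
    spark.sql("ALTER TABLE Transaction ADD COLUMNS (NewField STRING)")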
Refresh Error Details:
Power BI Premium Backend Error: An error has occurred while framing the dataset a65e467e-3367-442b-a446-4a60e26ebedd, error: Microsoft.AnalysisServices.OperationException: Failed to save modifications to the server. Error returned: 'We cannot access the source column '<oii>isFirstTrx</oii>' of delta table '<oii>Transaction</oii>' referenced by table '<oii>Transaction</oii>'. Either the source column does not exist, or you don't have access permissions. Consider removing the column reference from the table in the model. Please refer to https://go.microsoft.com/fwlink/?linkid=2248855 for more information. '. at Microsoft.AnalysisServices.Tabular.Model.SaveChangesImpl(SaveContext context) at Microsoft.ASWL.Service.Engine.SeethruAutoSync.SeethruAutoSyncManager.<InvokeFramingAsync>d__38.MoveNext() in /_/ASWL.Service/Engine/SeethruAutoSync/SeethruAutoSyncManager.cs:line 626.
Cluster URI: WABI-WEST-EUROPE-B-PRIMARY-redirect.analysis.windows.net
Activity ID: bea15505-55a5-4cdc-8696-643a0ab27c08
Request ID: bea15505-55a5-4cdc-8696-643a0ab27c08
Time: 2025-03-26 18:00:08Z
Anyone have any thoughts on the subject?
Hi,
I see you're experiencing a Direct Lake auto-sync error after modifying a Delta table schema in your Fabric Lakehouse. Let me explain what's happening and how to fix it.
Based on your description, you:
1. Created a Lakehouse, built a semantic model from its tables, and visualized them in Power BI.
2. Added a new column to the Delta-format Transaction table from a Spark notebook.
3. Renamed an existing column (isFirstTrx -> FirstTrx).
The error occurs because Direct Lake has cached the previous schema of your Transaction table. When you rename or modify columns, the auto-sync mechanism can't find the original column name it expects (isFirstTrx), causing the refresh to fail.
The error message indicates this clearly: "We cannot access the source column 'isFirstTrx' of delta table 'Transaction' referenced by table 'Transaction'."
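If you want to confirm what the Delta table actually looks like now, you can inspect its schema from a notebook (plain Spark, nothing Fabric-specific):

    # Print the table schema as Spark currently sees it
    spark.table("Transaction").printSchema()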
Here's how to fix this issue:
In some cases, you may need to force a fresh read of the table metadata. You can do this by refreshing the table in a Spark notebook connected to your Lakehouse:

    spark.sql("REFRESH TABLE Transaction")

This clears Spark's cached metadata for the table and can resolve schema discrepancies on the Lakehouse side after column modifications. The semantic model usually also needs its schema updated to pick up the renamed and added columns.
Hi @tolgakurt,
as we haven't heard back from you, we wanted to kindly follow up to check whether the provided solution worked for you. Please let us know if you need any further assistance.
Thanks,
Prashanth Are
MS Fabric community support
If this post helps, please consider accepting it as the solution to help other members find it more quickly, and give kudos if it helped you resolve your query.
Are you referencing that column in Power Query code or in visuals? If so, you need to remove or correct those references.
Yes, I was using these columns in some measure calculations, but that wasn't the problem. I found the solution while checking the settings in the semantic model: when we make a schema change in the dataset, we need to apply that change to the semantic model as well.
Semantic Model -> Open Data Model -> Edit Tables
When we do this and update the model, the error is resolved.
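If you'd rather trigger the subsequent refresh from a notebook instead of the UI, something along these lines should work (a sketch assuming the semantic-link (sempy) package is available; the model and workspace names are placeholders):

    import sempy.fabric as fabric

    # Refresh the Direct Lake model after its schema has been updated
    # ("MySemanticModel" and "MyWorkspace" are placeholder names)
    fabric.refresh_dataset(dataset="MySemanticModel", workspace="MyWorkspace")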
@tolgakurt Updating the semantic model is the key step after making schema changes in your Fabric Lakehouse. Any time you rename, add, or remove columns in a Lakehouse table, make sure to update the dataset schema in the semantic model and then refresh it in Power BI.
Thanks,
Prashanth
MS Fabric community support
If this post helps, please consider accepting it as the solution to help other members find it more quickly, and give kudos if it helped you resolve your query.
Hi Prashanth, I have a very similar case related to this post, so I'd appreciate your advice.
I'm using a pipeline to refresh a semantic model. Before the refresh, the pipeline executes a stored procedure that drops and re-creates some tables. I suspect the 'cache' is again the problem: the refresh occasionally fails with a 'table not found' error, even though the table does exist. I added a 'Wait' activity to let the system auto-synchronize the schema, but it doesn't solve the problem. Do you have any better ideas?
Did you find a solution to this? I'm having the same issue.
Hi PJ, no perfect solution. I reached out to MS support; they said that because the warehouse uses Delta tables underneath, dropping tables is not recommended. So I altered my stored procedure to truncate and reload the data instead of dropping and re-creating the tables, which mostly solved it.
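For anyone hitting the same issue, the change to the stored procedure was roughly this (a minimal sketch with placeholder table and column names):

    -- Before: DROP TABLE + CREATE TABLE changed the table's identity
    -- and occasionally broke the semantic model's reference to it.

    -- After: keep the table in place and just replace its contents.
    TRUNCATE TABLE dbo.MyStagingTable;

    INSERT INTO dbo.MyStagingTable (Col1, Col2)
    SELECT Col1, Col2
    FROM dbo.MySourceTable;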