I am struggling to get Optimal Refresh working in a Materialized View. It's always doing a Full Refresh.
Source Data
I created a table in my lake in Delta format, with CDF enabled; snippets below:
# Write the source table as Delta with Change Data Feed enabled
df.write.format("delta") \
    .option("delta.enableChangeDataFeed", "true") \
    .mode("append") \
    .save(dest_table_path)
And that table is updated with a merge, snippet below:
# Upsert changed rows into the destination, matching on id
delta_table.alias("dest").merge(
    changed_df.alias("src"),
    "dest.id = src.id"
).whenMatchedUpdateAll(condition="dest._ts <> src._ts") \
 .whenNotMatchedInsertAll() \
 .execute()
And I see in the Files area of that source table a "_change_data" folder, which correctly shows individual records that I add/update with my merge statement whenever it runs. So all of that seems good to me.
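For reference, the change feed can also be inspected directly from Spark; a minimal sketch, assuming the same dest_table_path as above (startingVersion 0 is illustrative):

# Read the Change Data Feed to inspect the individual insert/update
# records produced by the merge.
cdf_df = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", 0)
    .load(dest_table_path)
)

# _change_type distinguishes inserts from update pre/post images.
cdf_df.select("id", "_ts", "_change_type", "_commit_version").show()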
Materialized View
In the Materialized View management page, I have "Optimal Refresh" enabled.
Then I create a materialized view like this:
CREATE MATERIALIZED LAKE VIEW dbo.gold_users AS
SELECT
u.id,
u.organizationId,
u.firstName,
u.lastName,
u._ts
FROM
dbo.bronze_users u
That works; the table is created with all of the expected data.
Performing Refreshes
When I perform a refresh of the materialized views, this view is updated with the newest data, which I can confirm with a query. All that seems good.
However, according to the metrics history RefreshPolicy column, it's always doing a "FullRefresh" on this view.
I have tried to perform the materialized view refresh in two ways: Using the RUN button in the materialized view management page, and also using this command to refresh the single view: REFRESH MATERIALIZED VIEW dbo.gold_users.
Either way, it doesn't seem to want to do the optimal refresh. What am I doing wrong?
Hi @EricCarlson ,
What you are seeing is expected. Your materialized view only refreshes incrementally for new rows. When existing rows are updated, Fabric cannot detect those changes with the current setup, so it performs a full refresh.
To make updates refresh incrementally, the view must include Delta's change-tracking metadata. If you don't want those fields exposed, use a two-step approach: a staging view with change tracking, and a clean view for reporting. This ensures incremental refresh works for both new and updated rows.
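A rough sketch of that two-step layout, run from a notebook; here the staging view carries the source's _ts change-tracking field and the clean view hides it (the view names are illustrative, and the exact metadata columns Fabric requires are not confirmed in this thread):

# Staging view that keeps the change-tracking field (_ts).
spark.sql("""
    CREATE MATERIALIZED LAKE VIEW dbo.stg_users AS
    SELECT id, organizationId, firstName, lastName, _ts
    FROM dbo.bronze_users
""")

# Clean reporting view on top, without the tracking field.
spark.sql("""
    CREATE MATERIALIZED LAKE VIEW dbo.gold_users_clean AS
    SELECT id, organizationId, firstName, lastName
    FROM dbo.stg_users
""")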
Thank you.
Hi @EricCarlson ,
May I ask if you have resolved this issue? Please let us know if you have any further issues; we are happy to help.
Thank you.
Thanks everyone for your quick responses.
I did change my code to use "saveAsTable" and that might have been the fix. I did not change anything else, nor add any other columns to my select statement, and I was finally able to see an incremental refresh.
One thing I noticed, though, is that if there were updated rows in the source data, it always did a full refresh. If there were just new rows added, it would do the incremental refresh.
Is this expected behavior? If so, I was wondering if there are any workarounds or different architectures I could consider. I am sourcing data from Cosmos, which IS going to be updated, and since I am going to be dealing with tens of millions of rows but only a few thousand daily adds/updates, I don't want a full refresh on my materialized views. I really just want to 'upsert' the new/changed data into the materialized view.
I thought about just appending all inserts/updates into my source table, so it's always NEW rows, to make the materialized view work. But I have not done this before; I am concerned about an ever-growing volume of data in this source table, and then I would have to pull only the most recent record for each unique ID out of this source to put into my materialized view. It sounds complicated; hopefully there is a better way.
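For what it's worth, picking the latest record per ID out of an append-only source is a standard dedupe pattern; a minimal sketch, assuming _ts orders the versions of a row (the table name is hypothetical):

from pyspark.sql import functions as F
from pyspark.sql.window import Window

# Append-only source: every insert/update lands as a new row.
src = spark.read.table("dbo.bronze_users_appendonly")  # hypothetical name

# Keep only the most recent record per id, ordered by _ts descending.
w = Window.partitionBy("id").orderBy(F.col("_ts").desc())
latest = (
    src.withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)
    .drop("rn")
)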
Hi @EricCarlson ,
I wanted to check if you had the opportunity to review the information provided and resolve the issue? Please let us know if you need any further assistance. We are happy to help.
Thank you.
Hi @EricCarlson ,
Thank you for reaching out to Microsoft Fabric Community.
Thank you @shashiPaul1570_ and @edwinter for the prompt responses.
Your Materialized View is always doing a FullRefresh because the source Delta table is not a managed Lakehouse table. Optimal Refresh only works when the source table is created inside the Lakehouse Tables folder. Using .save(dest_table_path) creates an external Delta folder, which forces FullRefresh even if CDF is enabled.
Recreate the table using saveAsTable("bronze_users") with CDF enabled so Fabric manages it. Ensure CDF was enabled before the first data load. Recreate the MV and refresh it; you should now see Optimal/Incremental refresh.
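A minimal sketch of the recreate step, assuming the same DataFrame as the original write (the table name mirrors the MV query above):

# Recreate the source as a managed Lakehouse table with CDF enabled
# from the first write, instead of saving to an external path.
df.write.format("delta") \
    .option("delta.enableChangeDataFeed", "true") \
    .mode("overwrite") \
    .saveAsTable("bronze_users")

# For an existing table, CDF can also be switched on afterwards,
# though changes are only tracked from that point forward:
spark.sql("ALTER TABLE bronze_users SET TBLPROPERTIES (delta.enableChangeDataFeed = true)")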
Hi,
Your Materialized Lake View is always doing a FullRefresh because one or more required conditions for Optimal Refresh are not satisfied. In Fabric, Optimal Refresh only works when the MV can use Delta Lake CDF metadata, and this requires strict rules.
Your MV query does not include the required Delta CDF metadata columns, such as:
_commit_version
_commit_timestamp
_change_type
Without these, Fabric cannot calculate incremental changes, so it always falls back to FullRefresh.
Other conditions that can force a FullRefresh:
Source table is not a native Lakehouse Delta table (e.g., shortcut or external).
MV query includes any unsupported operations (joins, aggregations, filters, subqueries).
Source CDF contains delete records.
Source or MV uses a non-default schema (dbo instead of default).
Modify your MV so it includes the required CDF metadata columns:
This allows the MV engine to detect inserts/updates and use Optimal Refresh instead of FullRefresh.
I am curious where the documentation is that supports the requirements listed above, specifically the additional columns you mention as required on the source table and in the MLV.
I'm pretty sure the guy is Kudos farming with AI responses. In this instance I think the response has combined references from Databricks with Fabric.
Is this a Copilot-generated response? I cannot see any references to this requirement in any of the documentation. Additionally, most of your response appears to be AI-generated with little to no editing, so I hope you respond to confirm you are not a bot.
@EricCarlson, be aware that sometimes, due to the small size of the source dataset, Fabric might choose a full refresh instead of an incremental refresh, since in that case the performance is better.