Hello! Today, for the second time (the first was yesterday), I changed things in my lakehouse (updated rows and added columns), only to find that the newly refreshed model still reflected the old state. The SQL endpoint, from which the model gets its data, consequently also showed only the old lakehouse state. I added a 15-minute wait to my pipeline between the last lakehouse edit and the model refresh, to no avail. The lag of the SQL endpoint seems to be in the single-digit hours, which is completely unacceptable for any practical purpose.
So my questions are: Is this a known issue? Can I improve it? Can I, for example, manually trigger some kind of SQL endpoint refresh? Let me add that the insidious part is that it throws no error, so you only notice something is wrong when it is already too late.
Thanks for any help!
Hi @Anonymous ,
I would like to know what method you use to import data into the lakehouse and make changes. Is it through a pipeline?
I see that some people are having the same problem as you:
Solved: SQL Endpoint Slow To Reflect Changes In Lakehouse - Microsoft Fabric Community
If you have any other questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
How do you add columns?
Have you made any other changes to the columns in the table?
As an example, enabling the Lakehouse table's column name mapping mode in a Notebook can cause the SQL Analytics Endpoint to stop syncing properly with the Lakehouse table.
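For illustration, here is a minimal notebook sketch of what enabling column mapping looks like and how to check whether a table already has it. The table name is a placeholder, and the exact reader/writer versions are the usual Delta requirements for column mapping, not something specific to this thread:

```python
# Illustrative only: "my_table" is a placeholder Lakehouse table name.
# Enabling Delta column mapping (e.g. to allow column renames) looks like this;
# per the reply above, tables with this property may stop syncing to the
# SQL analytics endpoint.
spark.sql("""
    ALTER TABLE my_table SET TBLPROPERTIES (
        'delta.columnMapping.mode' = 'name',
        'delta.minReaderVersion' = '2',
        'delta.minWriterVersion' = '5'
    )
""")

# Check whether a table already has column mapping enabled.
spark.sql("SHOW TBLPROPERTIES my_table").show(truncate=False)
```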
I found this blog post that triggers the refresh of a lakehouse's SQL endpoint with a Python script. It works like a charm for me. I had the problem that the SQL endpoint had a lag of 30+ minutes. This Python script (which I ran in a notebook) reduced it to 6 minutes, and it waits until the SQL endpoint is refreshed, so you can be sure that activities that run after the Python notebook completes will see the refreshed SQL endpoint.
https://www.obvience.com/blog/fix-sql-analytics-endpoint-sync-issues-in-microsoft-fabric-data-not-sh...
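The blog's script is not reproduced here; as a rough sketch of triggering the same kind of refresh from a Fabric notebook, assuming the preview refreshMetadata REST API for SQL analytics endpoints and the semantic-link (sempy) client, with placeholder IDs you would need to fill in:

```python
# A minimal sketch (not the blog's exact script). Assumptions:
#  - the semantic-link (sempy) library available in Fabric notebooks,
#  - the preview "refreshMetadata" REST API for SQL analytics endpoints.
import sempy.fabric as fabric

workspace_id = "<your-workspace-id>"        # placeholder
sql_endpoint_id = "<your-sql-endpoint-id>"  # placeholder: the SQL analytics endpoint item ID

client = fabric.FabricRestClient()

# Request a metadata sync between the Lakehouse and its SQL analytics endpoint.
resp = client.post(
    f"/v1/workspaces/{workspace_id}/sqlEndpoints/{sql_endpoint_id}/refreshMetadata?preview=true",
    json={},
)

# 200 = sync finished (body lists per-table sync status);
# 202 = sync accepted and still running (poll the URL in the Location header).
print(resp.status_code)
print(resp.json() if resp.content else "(no body)")
```

If you want the "wait until refreshed" behavior described above, you would poll the long-running operation (or re-query the endpoint) until the sync reports as finished before continuing with downstream activities.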
If I understand correctly, this could be a known issue and you could possibly try the workaround mentioned here: https://community.fabric.microsoft.com/t5/Data-Engineering/SQL-Endpoint-Slow-To-Reflect-Changes-In-L...