Hi everyone,
I’m working on a project in Fabric that involves periodically updating data in the Data Lake via a notebook, followed by a complete refresh of semantic models to ensure that Power BI reports reflect the latest data.
The first semantic model in the sequence does not update correctly. While the task appears to complete successfully, the model doesn’t reflect the new data from the Data Lake.
To address this temporarily, I had to add a second refresh task for the same model, queuing it as shown in the attached image. This workaround ensures that the model updates correctly on the second execution.
Has anyone encountered a similar issue? Could it be that the semantic model refresh task starts before the updated data is fully available in the Data Lake? If so, what strategies would you recommend to handle this situation without duplicating the task?
Thanks in advance for your support!
Solved!
Hi @red_lotus85, it's likely that the lakehouse SQL endpoint metadata has not been refreshed by the time the semantic model refresh task runs.
There is a manual way of triggering that refresh which you could add into your pipeline.
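As an illustration, a minimal sketch of triggering that metadata refresh from a notebook step via the Fabric REST API. The exact endpoint path and the `refreshMetadata` action name are assumptions here, and how you obtain the bearer token depends on your setup; check the official Fabric REST API documentation for the current shape of this call.

```python
import urllib.request

# Assumed Fabric REST API base; verify against the official docs.
FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_refresh_url(workspace_id: str, sql_endpoint_id: str) -> str:
    """Build the (assumed) metadata-refresh URL for a lakehouse SQL endpoint."""
    return (
        f"{FABRIC_API}/workspaces/{workspace_id}"
        f"/sqlEndpoints/{sql_endpoint_id}/refreshMetadata"
    )

def refresh_sql_endpoint(workspace_id: str, sql_endpoint_id: str, token: str) -> int:
    """Ask Fabric to resync the SQL endpoint metadata with the lakehouse.

    `token` is an Entra ID bearer token for the Fabric API; acquiring it
    (service principal, notebook credential, etc.) is outside this sketch.
    Returns the HTTP status code.
    """
    req = urllib.request.Request(
        build_refresh_url(workspace_id, sql_endpoint_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # raises on HTTP errors
        return resp.status
```

Placing a call like this between the notebook that writes to the Data Lake and the semantic model refresh task should remove the need for the duplicated refresh.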
Thank you so much for your insight and for sharing the solution! 🙏
It turns out you were absolutely right: the issue was indeed the lakehouse endpoint metadata not being refreshed in time. I followed the approach suggested in the video you shared, and it worked perfectly!
I'm really grateful for your support; this has resolved my issue entirely. Thanks again for taking the time to help!
Hi @red_lotus85 ,
Thanks to AndyDDC for the reply; he has a point. Beyond that, I have some other suggestions:
You can introduce a Wait activity before the semantic model refresh task in the pipeline (that is, after the notebook that updates the Data Lake). This gives the lake enough time to finish writing the data before the refresh starts.
Use a polling mechanism to confirm that the data update task has completed successfully before starting the semantic model refresh task.
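The polling idea can be sketched as a small helper in the notebook that drives the update. The `check` callable here is a placeholder for whatever readiness test fits your data (for instance, querying a watermark or last-load timestamp table through the SQL endpoint); the function name and intervals are illustrative, not a Fabric API.

```python
import time

def wait_until_ready(check, timeout_s: float = 600.0, interval_s: float = 15.0) -> bool:
    """Poll `check()` until it returns True, or give up after `timeout_s` seconds.

    Returns True if the data became ready in time, False on timeout.
    Run this between the data-update step and the semantic model refresh.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if check():  # e.g. "does the watermark table show today's load?"
            return True
        time.sleep(interval_s)
    return False
```

On timeout you can fail the pipeline run explicitly instead of refreshing a stale model, which makes the failure visible rather than silent.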
If you have any other questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I misunderstand your needs or you still have problems on it, please feel free to let us know. Thanks a lot!