
Concerning the issue where the activity appears to have succeeded, but the update has not been applied

We are currently experiencing an issue where an activity is executed and marked as successful in the pipeline, even though the internal processing has not caught up and the update has not been applied.
We suspect this is primarily due to capacity-related factors. Even so, it is inconsistent for the activity to be displayed as successful when the update has not actually been reflected.

Would it be possible to implement a feature that verifies whether the update has been properly applied, or whether the activity has truly completed as intended?
Alternatively, would it be possible to surface this situation as an error?
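As a workaround until such a check exists, the "verify whether the update has been properly applied" idea can be approximated in the notebook itself (or in a follow-up activity) with a polling helper that fails loudly on timeout. This is only a sketch: `check_fn` stands in for whatever query confirms your specific update (e.g. comparing a row count or watermark) and is an assumption, not a Fabric API.

```python
import time

def verify_update_applied(check_fn, timeout_s=300, poll_s=10):
    """Poll check_fn until it returns True or the timeout expires.

    check_fn : callable returning True once the update is visible
               (hypothetical; supply your own table/watermark query).
    Raises TimeoutError so the calling pipeline activity fails
    instead of silently proceeding with stale data.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if check_fn():
            return True
        time.sleep(poll_s)
    raise TimeoutError(f"update not observed within {timeout_s}s")
```

Raising an exception here is deliberate: an unhandled exception is what makes the activity show up as failed rather than successful.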

Status: New
Comments
TakeakiTADA
Regular Visitor

I am facing the same problem.

A notebook called from a pipeline is reported as having finished successfully before its processing has actually completed.

As a result, if the next activity in the pipeline refreshes a semantic model that references a lakehouse table updated by the notebook, the refresh reads the pre-update data and the intended update does not appear.

 

So I think one of the following fixes would be good to have:

  • Make it possible to set a timeout period for notebook execution.
  • Provide a trigger that detects when a notebook's updates are complete, on the assumption that notebooks run asynchronously.
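Until either fix exists, the second idea can be approximated with a completion watermark: as its final step, the notebook writes a marker containing its run id, and the downstream activity only triggers the semantic model refresh once the marker matches. A minimal sketch, assuming the watermark would live somewhere both steps can read (in practice a lakehouse file or table; a local JSON file and these function names are illustrative only):

```python
import json
import os

def write_watermark(path, run_id):
    # Last step of the notebook: record that this run's writes are complete.
    with open(path, "w") as f:
        json.dump({"run_id": run_id}, f)

def watermark_matches(path, expected_run_id):
    # Downstream activity: proceed with the semantic model refresh
    # only when the notebook's final watermark matches this run.
    if not os.path.exists(path):
        return False
    with open(path) as f:
        return json.load(f).get("run_id") == expected_run_id
```

Because the watermark is written after all table updates, a matching watermark implies the data the refresh needs is already in place.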