In Azure Data Factory, I have a pipeline with Bronze, Silver, and Gold layers, plus a final "step 4" that refreshes a Power BI semantic model. This final step runs as a Databricks notebook and performs the following tasks:
# Get parameters: environment | workspaceID | datasetID
# Init client - based on service credentials from Unity Catalog and the Azure Key Vault URL
# Get access token based on:
- f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
- client credentials
# Call the Power BI Dataset Refresh API
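For reference, the token acquisition and refresh call described above might look roughly like this sketch. It assumes the v1 client-credentials flow against the stated token endpoint and the standard Power BI Refresh Dataset endpoint; the tenant/client credentials and workspace/dataset IDs are placeholders to be supplied from Unity Catalog / Key Vault as in the actual notebook:

```python
import json
import urllib.parse
import urllib.request

# Resource identifier for the Power BI REST API (v1 token endpoint flow)
PBI_RESOURCE = "https://analysis.windows.net/powerbi/api"

def token_url(tenant_id: str) -> str:
    return f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"

def refresh_url(workspace_id: str, dataset_id: str) -> str:
    return (f"https://api.powerbi.com/v1.0/myorg/groups/"
            f"{workspace_id}/datasets/{dataset_id}/refreshes")

def get_access_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    # Client-credentials grant: exchange the service principal's
    # secret for a bearer token scoped to the Power BI API.
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": PBI_RESOURCE,
    }).encode()
    with urllib.request.urlopen(token_url(tenant_id), data=body) as resp:
        return json.load(resp)["access_token"]

def trigger_refresh(token: str, workspace_id: str, dataset_id: str) -> int:
    # A successful request returns HTTP 202 Accepted; the refresh itself
    # runs asynchronously on the Power BI service. Service principals
    # must use NoNotification.
    req = urllib.request.Request(
        refresh_url(workspace_id, dataset_id),
        data=json.dumps({"notifyOption": "NoNotification"}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```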
Executing this final step/notebook works as expected.
ADF monitoring for this pipeline shows the daily execution results of the "master pipeline" (which includes step 4) for the last several days.
But the Power BI refresh history does not show these runs. For some reason, refresh history results are recorded on a daily basis, but the entries for the last six days are missing.
Any suggestions? What might be the reason for this "failure"?
Thank you
Hi @gaston_clynhens,
This behavior is expected because the Power BI refresh history view and Azure Data Factory (ADF) monitor logs operate at different execution layers.
In ADF, the pipeline run including your Databricks step 4 will always be visible in the ADF monitor once triggered. However, Power BI's Refresh History only logs dataset refreshes that the Power BI service itself registers, and this view is limited: it keeps only the most recent refresh records (roughly 20–60, with timestamps in UTC), and older or frequent refreshes are automatically removed.
Therefore, even if ADF shows successful runs, those may not appear in Power BI because:
The refresh history window has cycled out older entries.
Time zones differ (UTC vs local time).
The API call may have referenced a different dataset or workspace ID.
To verify, you can:
Check the workspaceId and datasetId used in your Databricks notebook.
Use the REST endpoint GET /groups/{workspaceId}/datasets/{datasetId}/refreshes to see the current stored history.
Match these with ADF monitor times (converted to UTC) for alignment.
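The second check above can be scripted. A minimal sketch, assuming a valid bearer token and the standard Get Refresh History endpoint (the `$top` parameter caps how many stored records are returned, newest first):

```python
import json
import urllib.request

def history_url(workspace_id: str, dataset_id: str, top: int = 60) -> str:
    # $top limits the number of stored refresh records returned
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
            f"/datasets/{dataset_id}/refreshes?$top={top}")

def get_refresh_history(token: str, workspace_id: str,
                        dataset_id: str, top: int = 60) -> list:
    # Each record carries UTC startTime/endTime plus a status
    # ("Completed", "Failed", "Disabled", or "Unknown" while in progress),
    # which you can compare against the ADF monitor times.
    req = urllib.request.Request(
        history_url(workspace_id, dataset_id, top),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]
```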
For long-term auditing, consider logging each refresh response (workspace ID, dataset ID, timestamp, run status) to storage within your pipeline, as Power BI’s history is not designed for permanent records.
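One simple way to implement such audit logging is to append one JSON Lines record per refresh request from the notebook; the field names and file layout here are illustrative, not a prescribed schema:

```python
import datetime
import json

def audit_record(workspace_id: str, dataset_id: str,
                 http_status: int, run_status: str) -> dict:
    # One row per refresh request; the timestamp is kept in UTC
    # to match what the Power BI service records.
    return {
        "workspaceId": workspace_id,
        "datasetId": dataset_id,
        "requestedAtUtc": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "httpStatus": http_status,
        "runStatus": run_status,
    }

def append_audit(path: str, record: dict) -> None:
    # Append as JSON Lines so each pipeline run adds one auditable row;
    # in a real pipeline, point this at mounted storage or a Delta table.
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```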
Datasets - Get Refresh History - REST API (Power BI Power BI REST APIs) | Microsoft Learn
Access the Power BI activity log - Power BI | Microsoft Learn
Troubleshoot refresh scenarios - Power BI | Microsoft Learn
Thank you.
Thank you very much for your feedback.
What happened?
Looking at the log of the executed notebook:
Failed to refresh dataset: {"error":{"code":"ItemNotFound","message":"Dataset \"xxxxxxxxxxxxxxxx\" is not found! Please verify datasetId is correct and user have sufficient permissions."}}
The dataset was not found.
Reason: an Azure DevOps repo feature-branch issue.
The contents of the repo's config file "ADF-Global-Parameters" (including this PBI datasetID) were overwritten based on an older feature branch.
CASE CLOSED
Hi @gaston_clynhens,
I'm glad to hear that you found a solution and resolved the query. Thank you for sharing it here!
This clearly explains the mismatch between the ADF monitor logs and Power BI refresh history.
The error message showed that the Power BI dataset was not found because the ADF-Global-Parameters config file was overwritten by an older feature branch in Azure DevOps that had an outdated datasetId. As a result, the Databricks notebook called the refresh API with an invalid ID, so Power BI did not record the refresh even though ADF indicated the run was successful.
Once the correct workspaceId and datasetId were restored, the refresh API functioned as expected and the issue was resolved.
Please mark the useful response as the accepted solution to help others in the community find it easily.
Thank you for being a part of the Microsoft Community Forum!
In your code you only request a semantic model refresh. This request may or may not be honored. The refresh may start after an arbitrary delay. The refresh may or may not succeed.
Implement your own status polling or use Fabric data pipelines that do the polling for you.
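Status polling can be sketched as follows, assuming a valid bearer token and the Get Refresh History endpoint; the interval and timeout values are arbitrary placeholders:

```python
import json
import time
import urllib.request

def is_terminal(status: str) -> bool:
    # Per the refresh history status values, "Unknown" typically means
    # the refresh is still in progress; "Completed", "Failed", and
    # "Disabled" are final states.
    return status != "Unknown"

def poll_refresh_status(token: str, workspace_id: str, dataset_id: str,
                        interval_s: int = 30, timeout_s: int = 1800) -> str:
    # Re-read the most recent refresh record ($top=1) until it
    # reaches a final state, then surface that state to the caller
    # so the pipeline step can fail when the refresh fails.
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
           f"/datasets/{dataset_id}/refreshes?$top=1")
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        req = urllib.request.Request(
            url, headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            records = json.load(resp)["value"]
        if records and is_terminal(records[0]["status"]):
            return records[0]["status"]
        time.sleep(interval_s)
    raise TimeoutError("Refresh did not reach a final state in time")
```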