I have a report published in the service that connects to Databricks.
About a month ago we started getting a timeout error, but the queries sent to Databricks weren't hitting the serverless SQL warehouse. We upgraded the warehouse from serverless to pro, and this seemed to solve the problem for about three weeks, until a couple of days ago. (The details below also describe what we encountered three weeks ago, before switching from serverless to pro.)
In the morning (UTC), we have ADF pipelines that kick off to refresh our data in Databricks and then refresh Power BI. (Edit: the report refreshes as part of the ADF pipeline, after everything else finishes refreshing, and the report uses a different SQL warehouse than the notebooks refreshing in Databricks, which run on a separate cluster.)
With the latest error, we can see the refresh kick off in Power BI, but it tries for two hours before failing: four attempts of about 30 minutes each, each trying to connect to Databricks.
If I look at the SQL warehouse monitoring for the past 24 hours (the window in which the Power BI failure happened), nothing shows up, so the queries aren't even reaching Databricks to spin up the SQL warehouse.
We had about three weeks of successful refreshes via the API, but then suddenly the same errors again, with nothing changed on the Power BI side.
But when I try a manual refresh later in the day, it succeeds and goes through just fine. Initially we thought that other workspaces within the org were using serverless capacity, which is why we changed from serverless to pro in the first place, but now we are seeing the same errors with pro.
I have looked for other solutions but haven't found much. I'm not sure why the scheduled API refresh in the morning won't even hit Databricks from Power BI, while a manual refresh later in the day will.
Edited to add:
SQL Warehouse Details
Hi @sarah_riecke,
Based on your description, it seems the scheduled refresh in Power BI is failing before the query reaches Databricks, which is why there’s no activity in the SQL Warehouse monitoring logs. Since the manual refresh works later in the day, the issue probably happens during the connection initialization from Power BI Service, not within Databricks.
This could be due to timing and overlapping workloads in the morning. When your ADF pipelines run to refresh Databricks and Power BI starts its dataset refresh around the same time, Power BI might have trouble connecting to the SQL Warehouse, especially if the warehouse needs to wake up or if other workloads are running. Power BI will retry several times, which matches the multiple retries and eventual failure you’ve noticed.
It’s also worth checking authentication and connection stability. If the Databricks connection uses a personal access token or stored credentials, there might be occasional authentication or connectivity issues during scheduled refresh. Re-validating credentials in the dataset settings can help rule this out.
Since the warehouse shows activity when you run the refresh manually, introducing a short delay between the ADF pipeline completion and the Power BI refresh could help, giving Databricks time to finalize updates and ensuring the warehouse is ready.
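Beyond a fixed delay, one option is to explicitly wake the warehouse from the pipeline before triggering the Power BI refresh. The Databricks SQL Warehouses REST API exposes `POST /api/2.0/sql/warehouses/{id}/start` and `GET /api/2.0/sql/warehouses/{id}` for polling state. A minimal sketch, assuming a PAT for auth; the host, warehouse ID, and token values are placeholders:

```python
import json
import time
import urllib.request


def build_request(host: str, warehouse_id: str, token: str,
                  action: str = "") -> urllib.request.Request:
    """Build a Databricks SQL Warehouses API request.

    action="" polls warehouse state (GET); action="start" wakes it (POST).
    """
    url = f"https://{host}/api/2.0/sql/warehouses/{warehouse_id}"
    if action:
        url += f"/{action}"
    return urllib.request.Request(
        url,
        method="POST" if action else "GET",
        headers={"Authorization": f"Bearer {token}"},
    )


def wait_until_running(host, warehouse_id, token,
                       timeout_s=600, poll_s=15) -> bool:
    """Start the warehouse, then poll until state is RUNNING or timeout."""
    urllib.request.urlopen(build_request(host, warehouse_id, token, "start"))
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        with urllib.request.urlopen(build_request(host, warehouse_id, token)) as resp:
            if json.load(resp).get("state") == "RUNNING":
                return True
        time.sleep(poll_s)
    return False
```

Running this (e.g. from an ADF Web/Function activity or a small notebook) right before the refresh step would rule out cold-start time as the cause.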
Reviewing the dataset refresh history in Power BI Service may also provide more details about the connection failure, such as activity IDs or error messages.
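The same history is available programmatically via the Power BI REST API's Get Refresh History endpoint (`GET https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes`). A sketch of pulling the error codes out of a response body; the exact contents of `serviceExceptionJson` vary, so treat the parsed fields as best-effort:

```python
import json

# URL template for Get Refresh History; group_id / dataset_id are placeholders.
REFRESH_HISTORY_URL = (
    "https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
    "/datasets/{dataset_id}/refreshes?$top={top}"
)


def summarize_failures(history: dict) -> list:
    """Extract error codes from a Get Refresh History response body.

    Each entry's serviceExceptionJson (when present) is itself a JSON
    string; decode it to surface the errorCode for failed refreshes.
    """
    failures = []
    for entry in history.get("value", []):
        if entry.get("status") != "Failed":
            continue
        detail = json.loads(entry.get("serviceExceptionJson") or "{}")
        failures.append({
            "requestId": entry.get("requestId"),
            "startTime": entry.get("startTime"),
            "errorCode": detail.get("errorCode"),
        })
    return failures
```

The `requestId` values this surfaces are what Microsoft support typically asks for when investigating a failed refresh.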
Overall, since the queries don’t appear in the Databricks logs, the issue is likely on the Power BI Service side during the connection stage, rather than with the warehouse configuration.
Thank you.
The Power BI refresh is part of the ADF pipeline; we do not use the scheduled refresh within the Power BI service. The pipeline refreshes everything else first, then kicks off the Power BI refresh, so Power BI isn't starting its refresh until those are done. So there is no overlap, at least within the same pipeline (I may have to confirm whether another pipeline starts running at the same time).
But also, assuming there is only one pipeline, our report refreshes AFTER the data is refreshed in the pipeline; once that completes, the pipeline sends the refresh request to Power BI. And the notebooks doing the data refresh run on a different cluster than the SQL warehouse we use for Power BI.
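Even with strict sequencing, it may still be worth testing whether the warehouse simply needs a moment to wake after the data load. In ADF that's a one-activity experiment: a Wait activity between the last Databricks step and the Power BI refresh step. A sketch of the activity JSON, where the activity names are placeholders for whatever your pipeline's steps are actually called:

```json
{
    "name": "WaitForWarehouseWakeUp",
    "type": "Wait",
    "dependsOn": [
        { "activity": "RefreshDatabricksTables", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": { "waitTimeInSeconds": 300 }
}
```

If a five-minute wait makes the morning refresh succeed, that points at warehouse startup rather than anything on the Power BI side.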