Because Direct Lake models don't currently support auto-binding to the lakehouse via deployment pipelines (we are not using Azure DevOps/Git), we switched to a DirectQuery model. Now I seem to have run into another blocker, but I'm hoping it's something I'm doing wrong or that there is another way.
In the model, I set up two parameters: one for the SQL Server and one for the database. The SQL endpoint servers differ per environment (DEV, QA, PROD); the database name is the same across all environments.
In the service, I have connections set up that use a service principal (not SSO), one for each environment.
I deploy the DirectQuery model to DEV, then deploy to QA. At this point I check the QA workspace and the model is still pointed at the DEV lakehouse, which is to be expected. Also, in the model settings, the Cloud connection shows the ~sql_dev connection I had set up, again expected since I have not set up the parameter rule yet. I then add the deployment (parameter) rule to use the QA SQL endpoint server and re-deploy to QA.
On this second re-deploy I get an error:
"Can't complete the deployment - Backend Error. We were unable to retrieve the credentials for the data source..... This error occurs when a request attempts to add a new data source to a semantic model and perform a data refresh in a single transaction, which is not supported. In such cases, add new data sources in a separate transaction first and then separately perform the data refresh."
From my research, it sounds like Fabric deployment pipelines cannot change the SQL analytics endpoint server hostname for DirectQuery models.
Does anyone have any insight on how to make this work or what I may be doing wrong here?
Thanks!
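(For anyone trying to reproduce or script these steps: the same DEV-to-QA deploy can also be driven through the Power BI deployment pipelines REST API instead of the portal. Below is a minimal, unverified Python sketch; the pipeline ID and token are placeholders, and `deployAll` deploys every item in the stage, not just the semantic model.)

```python
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def build_deploy_body(source_stage_order: int) -> dict:
    """Build the body for Pipelines - Deploy All (0 = DEV->QA, 1 = QA->PROD)."""
    return {
        "sourceStageOrder": source_stage_order,
        "options": {
            # Overwrite items in the target stage even if they changed there.
            "allowOverwriteArtifact": True,
            "allowCreateArtifact": True,
        },
    }

def deploy_all(token: str, pipeline_id: str, source_stage_order: int) -> int:
    """POST the deploy request; returns the HTTP status code."""
    req = urllib.request.Request(
        f"{API}/pipelines/{pipeline_id}/deployAll",
        data=json.dumps(build_deploy_body(source_stage_order)).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example body only (no request is made here; IDs/token are placeholders):
body = build_deploy_body(0)
```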
Hi @mestee ,
Thank you for contacting the Fabric Community. If the SQL server (endpoint) changes in a DirectQuery model, Power BI recognizes it as a new data source. Deployment pipelines do not support creating and validating new data sources during deployment, which causes this error.
To resolve this, you can either use the same SQL endpoint across all environments or update and rebind the data source after deployment.
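For the second option, the post-deployment update can be scripted with the Power BI REST API (Datasets - Update Parameters) rather than done in the UI. A minimal Python sketch follows, assuming the model's parameters are named SqlServer and Database; the IDs and hostname are placeholders, and I have not verified this against a DirectQuery model bound to a Fabric SQL analytics endpoint:

```python
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def build_update_payload(params: dict) -> dict:
    """Build the request body for Datasets - Update Parameters."""
    return {"updateDetails": [{"name": k, "newValue": v} for k, v in params.items()]}

def update_parameters(token: str, group_id: str, dataset_id: str, params: dict) -> int:
    """POST new parameter values to the deployed dataset; returns HTTP status.

    Note: the caller may first need to own the dataset (Default.TakeOver).
    """
    url = f"{API}/groups/{group_id}/datasets/{dataset_id}/Default.UpdateParameters"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_update_payload(params)).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example body only (hostname is a hypothetical QA endpoint; no request is made):
body = build_update_payload({
    "SqlServer": "qa-endpoint.datawarehouse.fabric.microsoft.com",
    "Database": "MyLakehouse",  # same database name in every stage
})
```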
Helpful Reference: Create deployment rules for Fabric's ALM - Microsoft Fabric | Microsoft Learn
I hope this clarifies the situation. If I’ve misunderstood any part of your situation, please let us know.
Hello,
Thanks for the quick reply.
(1) We do not use the same SQL endpoint across all environments, so unfortunately, that isn't a resolution for us.
(2) I am not able to rebind the data source after deployment. The DirectQuery model uses previously established service principal connections (one per stage). After deploying the model from DEV to QA, the QA model's Cloud connection drop-down only offers 'Default: Single Sign-On (Entra ID)', the ~sql_dev connection that was carried over from DEV, and 'Create a connection'.
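When the UI offers no rebind option, one more avenue that may be worth testing is the Power BI REST API's Update Datasources call, which swaps a dataset's connection details (server/database) directly. This is a sketch only: it assumes the SQL analytics endpoint is reported as datasource type `Sql`, the hostnames and IDs are placeholders, and I have not verified it for this Fabric scenario:

```python
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def build_rebind_payload(old_server: str, new_server: str, database: str) -> dict:
    """Build the body for Datasets - Update Datasources: swap the server hostname."""
    return {"updateDetails": [{
        "datasourceSelector": {
            "datasourceType": "Sql",  # assumed type for the SQL analytics endpoint
            "connectionDetails": {"server": old_server, "database": database},
        },
        "connectionDetails": {"server": new_server, "database": database},
    }]}

def update_datasources(token: str, group_id: str, dataset_id: str, payload: dict) -> int:
    """POST the datasource swap to the deployed dataset; returns HTTP status."""
    url = f"{API}/groups/{group_id}/datasets/{dataset_id}/Default.UpdateDatasources"
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example body only (placeholder hostnames; no request is made here):
body = build_rebind_payload("dev-endpoint.example.com",
                            "qa-endpoint.example.com",
                            "MyLakehouse")
```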
That all said, I will mark your reply as Accepted Solution; if you have any other comments, please feel free to reply for future readers.
As a side note, we were only trying DirectQuery because Direct Lake isn't fully developed for our needs at this time. However, the performance I saw with DirectQuery (on a very small model/report) was such that it wasn't going to suit our needs anyway. So back to Import we go for now....
Thanks for sharing. Please stay engaged with the community for future discussions.