andy808
Resolver I

Backend Error when publishing a composite model through the pipeline

I have a pbix that has a DirectQuery source and an Import source. Publishing directly to Dev from Desktop with connection strings set to Dev refreshes fine, and publishing directly to Test with strings set to Test also refreshes fine. But when I publish to Dev, deploy via the pipeline from Dev to Test, and try to set the new rules for Test, I get this error:

 

Backend Error: Failed to save modifications to the server. Error returned: '{"error":{"code":"Premium_ASWL_Error","pbi.error":{"code":"Premium_ASWL_Error","parameters":{},"details":[{"code":"Premium_ASWL_Error_Details_Label","detail":{"type":1,"value":"We are unable to retrieve the credentials for the data source '<ccon>{\"address\":{\"kind\":\"SQL\"}}</ccon>'. This error might occur when a request attempts to add a new data source to a semantic model and perform a data refresh in a single transaction, which is not supported. In such cases, add new data sources in a separate transaction first and then separately perform the data refresh."}}],"exceptionCulprit":0}}}'.

 

Any ideas on what I need to do here to be able to use the pipeline?

1 ACCEPTED SOLUTION

I ended up creating parameters in the pbix for each data source, for each environment. This allowed me to set the parameters in the dataset Settings after I promoted via the pipeline, but before I applied the new parameter rules in the pipeline. I could then set the parameters, apply them, and refresh.

View solution in original post
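The accepted approach above can also be scripted. Below is a minimal, hedged Python sketch that only builds the two REST calls involved: updating the model's parameters in the Test workspace, then triggering the refresh as a separate transaction (keeping them separate is what avoids the Premium_ASWL_Error). The workspace/dataset IDs and the parameter names `SqlServer`/`SqlDatabase` are placeholders, not from the thread; the endpoints are the Power BI REST API's `Default.UpdateParameters` and `refreshes` routes, and a real call would also need an Azure AD bearer token.

```python
import json

# Power BI REST API root; a real call also needs an Azure AD bearer token.
API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def update_parameters_request(group_id: str, dataset_id: str, params: dict):
    """Build the 'Datasets - Update Parameters In Group' call (transaction 1).

    POST groups/{groupId}/datasets/{datasetId}/Default.UpdateParameters
    Body: {"updateDetails": [{"name": ..., "newValue": ...}, ...]}
    """
    url = f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/Default.UpdateParameters"
    body = {"updateDetails": [{"name": k, "newValue": v} for k, v in params.items()]}
    return url, body

def refresh_request(group_id: str, dataset_id: str):
    """Build the separate 'Datasets - Refresh Dataset In Group' call (transaction 2)."""
    url = f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    return url, {"notifyOption": "MailOnFailure"}

# Placeholder IDs and hypothetical parameter names, for illustration only.
url, body = update_parameters_request(
    "TEST-WORKSPACE-ID", "DATASET-ID",
    {"SqlServer": "test-sql.example.com", "SqlDatabase": "SalesDb_Test"})
print(url)
print(json.dumps(body))
```

Issuing these as two separate POSTs (parameters first, refresh only after the first succeeds) mirrors the "separate transaction" guidance in the error message itself.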

11 REPLIES 11
v-prasare
Community Support

If your issue still persists and you're blocked on this, please consider raising a support ticket for further assistance.
To raise a support ticket for Fabric and Power BI, follow the steps outlined in this guide:

How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn


v-prasare
Community Support

Hi @andy808,

We would like to confirm whether our community members' answers resolve your query, or if you need further help. If you still have any questions or need more support, please feel free to let us know. We are happy to help.


@anilgavhane, @grazitti_sapna and @lbendlin, thanks for your prompt responses.

 

 

Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support

Hi, checking in here to see if there is more guidance. It looks to be my lack of understanding of how to achieve Step 2 on a GCC Premium capacity. How do I do this when the data source is greyed out and cannot be updated manually:

  • Backend Limitation: Power BI Premium does not support adding a new data source and refreshing the semantic model in the same transaction. This triggers the Premium_ASWL_Error.

How to Fix It

1. Split the Deployment into Two Steps

  • Step 1: Deploy the model from Dev to Test using the pipeline without triggering a refresh.
  • Step 2: After deployment, manually update the data source credentials in Test.
  • Step 3: Perform a separate refresh after the credentials are saved.

[screenshots attached: andy808_0-1759936009353.png, andy808_1-1759936129105.png]

 

I still need support. I am on GCC, Premium capacity. I do not see anything in the settings or in the pipeline that allows me to switch the data sources manually (from Dev to Test). In the Gateway Connection settings, "the data source connected to this model" is greyed out. I do not want to publish directly to Test/Prod; I want to promote through the pipeline. This only happens with a model that has both Import and DirectQuery tables. I do not have any refresh turned on, and I have the credentials/gateway mapping all set. I get this error message:

  • Backend Error: Failed to save modifications to the server. Error returned: '{"error":{"code":"Premium_ASWL_Error","pbi.error":{"code":"Premium_ASWL_Error","parameters":{},"details":[{"code":"Premium_ASWL_Error_Details_Label","detail":{"type":1,"value":"We are unable to retrieve the credentials for the data source '<ccon>{\"address\":{\"kind\":\"SQL\"}}</ccon>'. This error might occur when a request attempts to add a new data source to a semantic model and perform a data refresh in a single transaction, which is not supported. In such cases, add new data sources in a separate transaction first and then separately perform the data refresh."}}],"exceptionCulprit":0}}}'.
anilgavhane
Responsive Resident

@andy808 

  • Composite Model Behavior: Your PBIX file includes both DirectQuery and Import sources. When publishing directly to Dev or Test with matching connection strings, everything works fine.
  • Pipeline Deployment Issue: When you publish to Dev and then deploy to Test via the pipeline, Power BI tries to apply new data source rules and refresh the dataset in one go.
  • Backend Limitation: Power BI Premium does not support adding a new data source and refreshing the semantic model in the same transaction. This triggers the Premium_ASWL_Error.

How to Fix It

1. Split the Deployment into Two Steps

  • Step 1: Deploy the model from Dev to Test using the pipeline without triggering a refresh.
  • Step 2: After deployment, manually update the data source credentials in Test.
  • Step 3: Perform a separate refresh after the credentials are saved.

This avoids the unsupported "add + refresh" combination.

2. Preconfigure Data Source Rules

  • Before deploying, ensure that the data source rules for Test are already defined and valid. This includes:
      • Correct connection strings
      • Valid credentials
      • Gateway mappings (if applicable)

3. Use Parameters for Environment Switching

  • Instead of hardcoding connection strings, use parameters in your model to switch environments.
  • This allows the pipeline to apply rules without modifying the underlying data source directly.

4. Check Gateway Configuration

  • If your model uses an on-premises SQL source, ensure the gateway is properly configured in both Dev and Test.
  • Missing or misconfigured gateways can also trigger credential errors.
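To make the "two transactions" split in step 1 concrete, here is a hedged Python sketch of the sequence as Power BI REST calls: deploy the pipeline stage content (which, on its own, does not refresh the model), leave a gap for the manual credential/gateway step, then refresh separately. The pipeline/workspace/dataset IDs are placeholders; the `deployAll` route and its `sourceStageOrder`/`options` body follow the documented Pipelines - Deploy All API, but verify the exact option names against the current docs before relying on them.

```python
import json

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def deploy_all_request(pipeline_id: str, source_stage_order: int):
    """Step 1: Pipelines - Deploy All (promotes content; does not refresh).

    source_stage_order selects the stage to deploy FROM (0 = Development,
    1 = Test); content is promoted to the next stage in the pipeline.
    """
    url = f"{API_ROOT}/pipelines/{pipeline_id}/deployAll"
    body = {
        "sourceStageOrder": source_stage_order,
        "options": {
            "allowCreateArtifact": True,     # create the item in Test if new
            "allowOverwriteArtifact": True,  # overwrite it if it already exists
        },
    }
    return url, body

def refresh_request(group_id: str, dataset_id: str):
    """Step 3: a separate refresh transaction, issued only after credentials are set."""
    url = f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    return url, {"notifyOption": "MailOnFailure"}

# Step 2 happens between these two calls: bind credentials / gateway mapping
# on the Test dataset (service UI, or the Gateways - Update Datasource API).
deploy_url, deploy_body = deploy_all_request("PIPELINE-ID", source_stage_order=0)
refresh_url, _ = refresh_request("TEST-WORKSPACE-ID", "DATASET-ID")
print(deploy_url)
print(json.dumps(deploy_body))
print(refresh_url)
```

The point of the sketch is the ordering, not the payload details: as long as the deploy, the credential update, and the refresh are three separate requests, the unsupported single-transaction "add + refresh" path is never taken.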

 

@anilgavhane #4 is done; I have gateway connections with credentials already configured and mapped to the Test data source. It is step #1 where I am having the issue. I do not see anywhere that I can:

  • Step 1: Deploy the model from Dev to Test using the pipeline without triggering a refresh. How do I do this?
  • Step 2: After deployment, manually update the data source credentials in Test. How?
andy808
Resolver I

Thank you both for your responses. I do not have a scheduled refresh enabled. For step 3, I do not know how or where to do this. In the pipeline I set the rule to switch the data source from Dev to Test; if I do not do that, the report published to Test via the pipeline will still point to the Dev data source, and I do not see a way to edit the data source or update credentials. The image below is all greyed out, with no option to change it:

[screenshot attached: andy808_0-1759511465791.png]

We use data gateway connections that are already mapped to the environment, with credentials already set.

 

Hi @andy808 

Try running the deployment rules one by one.
Please let us know if you still face issues.

grazitti_sapna
Super User

Hi @andy808 

 

To resolve this, you can try the following steps and let us know if the issue still persists.

  1. Publish to Test using the pipeline as you normally do.
  2. After deployment, go to the dataset settings in the Test workspace.
  3. Edit the data source credentials manually for all sources (DirectQuery and Import), even if you set up rules.
  4. Save the credentials.
  5. Only after credentials are set/re-confirmed, attempt the refresh or allow the pipeline to proceed with subsequent tasks.

This extra manual step is necessary because the service cannot update data source credentials and perform a refresh in one transaction on Premium capacity, especially for composite models. The best practice is always to configure/confirm credentials immediately after deployment, then trigger a refresh.
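To confirm that the refresh triggered after the manual credential step actually succeeded, the refresh history endpoint can be polled. A small hedged sketch, assuming the response shape of the documented Get Refresh History API (`{"value": [...]}` with a `status` of `Completed`, `Failed`, or `Unknown` while still in progress); the sample payload below is fabricated for illustration only:

```python
API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def refresh_history_url(group_id: str, dataset_id: str, top: int = 1) -> str:
    """GET 'Datasets - Get Refresh History In Group' (most recent entry first)."""
    return f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/refreshes?$top={top}"

def latest_refresh_status(history: dict) -> str:
    """Pull the status of the most recent refresh out of the API response.

    'Unknown' means a refresh is still running; 'Completed' and 'Failed'
    are terminal states.
    """
    entries = history.get("value", [])
    return entries[0]["status"] if entries else "NoRefreshes"

# Fabricated sample response, mimicking the documented shape.
sample = {"value": [{"refreshType": "ViaApi", "status": "Completed"}]}
print(latest_refresh_status(sample))  # Completed
```

Polling this after the separate refresh call makes the credential fix verifiable rather than assumed.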

 


lbendlin
Super User

Just a guess - do you have a refresh schedule enabled for the semantic model in the Test workspace?
