marvelbites06
Frequent Visitor

Fabric Warehouse Component Deployment using Azure Repos (DevOps)

I am working on a project to deploy a Warehouse from a Fabric workspace. I also want version control, so I am using Azure Repos. I have set up the repo to deploy all committed changes, but I need to set up the deployment pipeline so that I can select a single table from the Warehouse and deploy it to the next branch (Test or Prod). I am currently writing YAML to build this feature but I am stuck; Copilot suggested another approach using Azure Pipelines, and I am stuck there as well. It would be helpful if anyone could advise. Power BI deployment pipelines deploy the entire warehouse irrespective of whether it is complete, which can cause issues with the prod tables. This is my first time doing this, so any help would be appreciated.

@zhouguyin 

@GuyInACube 

@KratosBI 

 

@suparnababu8 

@burakkaragoz 

@nilendraFabric 

1 ACCEPTED SOLUTION

@marvelbites06 ,

 

Thanks for sharing your YAML and more details! You're on the right track, and it's great to see your pipeline taking shape. Let me help clarify a few points and share some practical steps and sample YAML to get you going, especially around subscription handling and modular deployments.

1. Azure Subscription in YAML:
To use Azure CLI tasks in your pipeline, you need to provide a service connection that has access to the right Azure subscription. In Azure DevOps, this is handled via a “service connection” (not just the subscription ID).

  • In your YAML, instead of manually specifying the subscription ID as a variable, use the azureSubscription input, which references a service connection you create in Azure DevOps (Project Settings > Service connections).
  • Example:
    YAML
     
    - task: AzureCLI@2
      inputs:
        azureSubscription: 'Your-Service-Connection-Name'
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          # Your scripts here
  • The service connection handles authentication and securely stores credentials.

2. Modular/Single & Multi-Select Deployments:
You can use pipeline variables or parameters to control which tables or components deploy, making your pipeline flexible for single or multi-table deployment.

  • Define a parameter at the top of your YAML:
    YAML
     
    parameters:
    - name: tablesToDeploy
      type: object
      default: ['Table1', 'Table2']
  • Reference this parameter in your deployment scripts to loop through and deploy only the selected tables.
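Putting the parameter together with the Azure CLI task, one hedged sketch of the loop is a compile-time `${{ each }}` expression that generates one deployment step per selected table. The repo layout (one `.sql` script per table under `warehouse/tables/`) is an assumption for illustration:

```yaml
parameters:
- name: tablesToDeploy
  type: object
  default: ['Table1', 'Table2']

steps:
# One deployment step is generated per selected table at template expansion time.
- ${{ each table in parameters.tablesToDeploy }}:
  - task: AzureCLI@2
    displayName: 'Deploy ${{ table }}'
    inputs:
      azureSubscription: 'Your-Service-Connection-Name'
      scriptType: 'bash'
      scriptLocation: 'inlineScript'
      inlineScript: |
        # Hypothetical layout: one .sql script per table in the repo.
        sqlcmd -S $(sqlEndpoint) -d $(databaseName) -G -i "warehouse/tables/${{ table }}.sql"
```

When you queue the run manually, Azure DevOps prompts for `tablesToDeploy`, which is what gives you single- or multi-table selection per run.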

3. Sample Step for Validating Deployment:
Here's how you might add a validation step for your Fabric SQL endpoint:
    YAML
     
    - task: AzureCLI@2
      inputs:
        azureSubscription: 'Your-Service-Connection-Name'
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          az login --service-principal -u $(clientId) -p $(clientSecret) --tenant $(tenantId)
          sqlcmd -S $(sqlEndpoint) -d $(databaseName) -U $(sqlUser) -P $(sqlPass) -Q "SELECT COUNT(*) FROM INFORMATION_SCHEMA.TABLES"

  • Make sure your pipeline variables (like clientId, clientSecret, etc.) are set using Azure DevOps secrets for security.
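One detail worth spelling out: Azure DevOps does not expose secret variables to scripts automatically, so a common pattern is to map them into the task's environment explicitly via `env`. A sketch, with placeholder variable names:

```yaml
- task: AzureCLI@2
  inputs:
    azureSubscription: 'Your-Service-Connection-Name'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # $CLIENT_SECRET comes from the env mapping below, not from $(clientSecret),
      # because secret variables are not mapped into scripts automatically.
      az login --service-principal -u "$CLIENT_ID" -p "$CLIENT_SECRET" --tenant "$TENANT_ID"
  env:
    CLIENT_ID: $(clientId)
    CLIENT_SECRET: $(clientSecret)   # secret variable defined in the pipeline UI or a variable group
    TENANT_ID: $(tenantId)
```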

4. Step-by-Step:

  • Set up an Azure Service Connection in Azure DevOps for your subscription.
  • Reference this connection in your YAML (azureSubscription: ...).
  • Use parameters/variables to control which components get deployed.
  • Add validation steps as needed (as above).
  • For multi-table or environment-based deployment, loop through parameterized table names or environment settings.
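For the branch-to-environment mapping in the last bullet, one hedged sketch (stage and branch names are illustrative) is to gate each stage on the branch that triggered the run:

```yaml
stages:
- stage: DeployTest
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/ADS_DEV')
  jobs:
  - job: Deploy
    steps:
    - script: echo "Deploy selected tables to the Test warehouse"
- stage: DeployProd
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')
  jobs:
  - job: Deploy
    steps:
    - script: echo "Deploy selected tables to the Prod warehouse"
```

You can also attach approvals to a Prod environment so the stage waits for a manual check before anything touches production.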

If you want, I can share a more complete YAML template that includes parameterized deployment and best practices for secrets/connection handling. Just let me know your exact scenario (e.g., number of environments, single vs. multiple tables, etc.), and I can tailor it for you.

Summary:

  • Use Azure DevOps “service connections” for subscription/auth.
  • Parameterize your pipeline for flexible deployment.
  • Use inline scripts for validation and deployment logic.
  • Secure your secrets with pipeline variables.

Let me know if you’d like a full example YAML or need help with a specific step! You’re making great progress—just a couple of tweaks and you’ll have a robust deployment pipeline.


7 REPLIES
v-prasare
Community Support

Hi @marvelbites06,

As we haven't heard back from you, we wanted to follow up and check whether the issue has been resolved. Please let us know if you need any further assistance.

Thanks,

Prashanth Are

MS Fabric community support

Hi @v-prasare @burakkaragoz 

I was not able to get back as my service connection was not created and I could not move ahead. Now the service connection is created and the pipeline is validated. It would be really helpful if you could guide me through the parameter creation and how to map the branches. I am pasting the YAML below.

I have a Main branch and ADS_DEV as the testing branch. I want to move data from Test to Prod; for example, I want to move Sales_Document_Fact to Prod. How do I parameterise that?

Let me know if we can connect over Teams as well; I will be happy to send you the invite.

# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
- main

pool:
  vmImage: 'windows-latest'

variables:
  sqlEndpoint: 'cji6xqkgpgpcxiacaz7r2se-lol6ltv7evohd5tzhxy6e.datawarehouse.fabric.microsoft.com'
  databaseName: 'ADS_GSC_DATAMART_WH'  # Replace with your actual Fabric warehouse name

steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: 'GSC-NONP-->PRO-ADSCLOUD-001-D-SC'  # Replace with your Azure DevOps service connection
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      echo "Logging in with service principal..."
      az login --service-principal -u $servicePrincipalId -p $servicePrincipalKey --tenant $tenantId

      echo "Testing connection to Fabric SQL endpoint..."
      sqlcmd -S $sqlEndpoint -d $databaseName -G -Q "SELECT TOP 1 * FROM INFORMATION_SCHEMA.TABLES"
  env:
    servicePrincipalId: $(clientId)
    servicePrincipalKey: $(clientSecret)
    tenantId: $(tenantId)
    sqlEndpoint: $(sqlEndpoint)
    databaseName: $(databaseName)
v-prasare
Community Support

Hi @marvelbites06,

As we haven't heard back from you, we wanted to follow up and check whether the issue has been resolved. Please let us know if you need any further assistance.

Thanks,

Prashanth Are

MS Fabric community support

v-prasare
Community Support

Hi @marvelbites06,

We would like to follow up to see if the solution provided by the super user resolved your issue. Please let us know if you need any further assistance.

 

@burakkaragoz, thanks for your prompt response.

 

If our super user response resolved your issue, please mark it as "Accept as solution" and click "Yes" if you found it helpful.

marvelbites06
Frequent Visitor

Hi @burakkaragoz, thank you very much for the response. I have built the pipeline but I am not able to add an Azure subscription to it, and I am not sure where to get it from. I am struggling to get past this point. Below is the YAML that I have created (not sure if it is correct). It would be great if you could share a step-by-step process and YAML examples for single and multi-select deployment. I have looked for similar content on the web but not much is available. Below is the YAML:

trigger:
- main

pool:
  vmImage: 'windows-latest'

variables:
  sqlEndpoint: 'cji6xqkgpgouzptzhxy6e.datawarehouse.fabric.microsoft.com'
  databaseName: 'SDS_DATAMART_WH'

steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: '4357545a-54yu-5kfg-7u58-773d303742ad'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      echo "Logging in with service principal..."
      az login --service-principal -u $servicePrincipalId -p $servicePrincipalKey --tenant $tenantId

      echo "Testing connection to Fabric SQL endpoint..."
      sqlcmd -S $sqlEndpoint -d $databaseName -G -Q "SELECT TOP 1 * FROM INFORMATION_SCHEMA.TABLES"
  env:
    servicePrincipalId: $(clientId)
    servicePrincipalKey: $(clientSecret)
    tenantId: $(tenantId)
    sqlEndpoint: $(sqlEndpoint)
    databaseName: $(databaseName)


burakkaragoz
Community Champion

Hi @marvelbites06 ,

 

Great to see you tackling Fabric Warehouse deployments with Azure Repo and DevOps! We’ve actually been working on similar projects with our team, so I thought I’d share a few insights and practical tips that might help you move forward.

First off, you’re definitely on the right track by using Azure Repos for version control and YAML pipelines for automation. Selecting and deploying a single table from the Warehouse, rather than the entire set, is a common need—especially for controlled releases or targeted testing.

Here’s what we’ve found works well:

  • In your deployment YAML, you can use parameters or variables to specify which table(s) should be included in each pipeline run. For example, passing the table name as a pipeline parameter lets you control deployments dynamically.
  • For environments (Test/Prod), setting up separate branches or environments in your pipeline helps direct deployments precisely where you want.
  • If you want granular control (deploying only completed and validated tables), you can add checks in your pipeline to validate status before deployment, avoiding accidental overwrites in production.
  • Power BI deployment pipelines do tend to push everything, but with Azure DevOps and some custom scripting, you can be as selective as you need.
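As a sketch of the validation idea in the third bullet (the table name and variables are placeholders from this thread), a gate step might fail the run if the table is missing in Test before anything is promoted:

```yaml
- task: AzureCLI@2
  displayName: 'Validate table exists in Test before promoting'
  inputs:
    azureSubscription: 'Your-Service-Connection-Name'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # Fail the run early if the table is not present, so nothing reaches Prod.
      rows=$(sqlcmd -S $(sqlEndpoint) -d $(databaseName) -G -h -1 -W \
        -Q "SET NOCOUNT ON; SELECT COUNT(*) FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'Sales_Document_Fact';")
      if [ "$rows" -eq 0 ]; then
        echo "Table not found in Test; aborting promotion."
        exit 1
      fi
```

A non-zero exit code fails the task, which stops any downstream deployment steps in the same job.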

We’ve tackled a bunch of similar challenges and have built out modular YAML templates for Fabric deployments, with step-by-step controls for single-table or multi-table deployment scenarios. If you want, we can share some sample YAML snippets or even walk you through our approach.

If you run into specific errors or want to share a piece of your YAML, feel free to tag me (@burakkaragoz) or anyone from our group. We’re always happy to brainstorm and troubleshoot together.

Good luck with your setup! I’m sure you’ll have it running smoothly soon, and your approach will help a lot of others in the community as well.

Let us know how it goes or if you need more concrete examples!
