
Terraform with Azure DevOps CI/CD Pipelines – Tutorial


In this article, we will look at how to run Terraform in an Azure DevOps pipeline, step by step. We will create an Azure DevOps organization and project, set up Terraform in Azure DevOps, and write Terraform configuration files and a YAML pipeline definition, sharing examples and best practices along the way.

  1. Prerequisites
  2. How to run Terraform in an Azure DevOps pipeline
  3. Best practices for Terraform in Azure DevOps pipelines

TL;DR

Terraform works well with Azure DevOps for provisioning Azure infrastructure through repeatable CI/CD pipelines.

 

In a production-ready setup, you should store Terraform state remotely in Azure Blob Storage, run terraform plan on pull requests, require approval before terraform apply, and separate deployments by environment.

 

For better security and maintainability, avoid long-lived secrets where possible, pin Terraform and provider versions, and use a pipeline design that makes plans, approvals, and applies easy to review.

Prerequisites

Before using the example below, create the following in Azure DevOps and Azure:

  1. An Azure Resource Manager service connection named sc-terraform-prod that uses Workload identity federation.
  2. An Azure Storage Account and Blob container for Terraform state. In the example below, the container name is tfstate.
  3. An Azure DevOps environment named prod, with an Approval check configured on that environment.
  4. Role assignments for the service connection identity:
    • Contributor on the target subscription or resource group
    • Storage Blob Data Contributor on the state storage account/container

Azure DevOps can create Azure Resource Manager service connections using workload identity federation, and Azure Pipelines environments support manual approval checks for production deployments. 

For the azurerm backend using Azure AD/OIDC authentication, the documentation recommends Storage Blob Data Contributor on the storage account container as the least-privilege data-plane role for Terraform state access. 
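
As a sketch, the state storage and role assignments from the prerequisites can be created with the Azure CLI. The resource group, storage account name, and principal ID below are placeholders, not values this tutorial mandates; replace them with your own:

```shell
# Hypothetical names -- replace with your own values
RG="rg-tfstate"
SA="sttfstateexample"   # storage account names must be globally unique
LOCATION="westeurope"
PRINCIPAL_ID="<object-id-of-the-service-connection-identity>"

# Storage account and blob container for Terraform state
az group create --name "$RG" --location "$LOCATION"
az storage account create --name "$SA" --resource-group "$RG" \
  --location "$LOCATION" --sku Standard_LRS
az storage container create --name tfstate --account-name "$SA" --auth-mode login

# Least-privilege data-plane role for state access
SA_ID=$(az storage account show --name "$SA" --resource-group "$RG" --query id -o tsv)
az role assignment create --assignee "$PRINCIPAL_ID" \
  --role "Storage Blob Data Contributor" --scope "$SA_ID"

# Contributor on the target subscription (or scope this to a resource group instead)
az role assignment create --assignee "$PRINCIPAL_ID" \
  --role "Contributor" --scope "/subscriptions/$(az account show --query id -o tsv)"
```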

How to run Terraform in an Azure DevOps pipeline

Follow these steps to run Terraform in an Azure DevOps pipeline:

  1. Set up an Azure DevOps project
  2. Create Terraform configuration files
  3. Define CI/CD steps in YAML
  4. Configure the Azure DevOps pipeline
  5. Run the pipeline

1. Set up an Azure DevOps project

With your account set up, sign in to Azure DevOps and create a project to host your CI/CD pipelines.

  1. Go to the Azure DevOps website and sign in with your Microsoft account or organizational account.
  2. Select your organization: If you’re part of multiple organizations, choose the one where you want to create the project from the top-right corner of the page.
  3. Click on the “New Project” button.
  4. In the “Create a new project” form, fill in the Project Name and visibility (whether the project should be public or private) and, optionally, provide a description for your project, then click on the “Create” button.
  5. Once the project is created, you’ll be redirected to the project’s dashboard. Depending on your requirements, you may want to customize your project further by adding team members, configuring permissions, setting up repositories, boards, and pipelines, and integrating with other services.
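
If you prefer the command line, the same project can be created with the Azure DevOps extension for the Azure CLI. The organization URL and project name below are placeholders:

```shell
# One-time: install the Azure DevOps extension for the Azure CLI
az extension add --name azure-devops

# Hypothetical organization and project names -- replace with your own
az devops project create \
  --name "terraform-azure-devops" \
  --organization "https://dev.azure.com/<your-org>" \
  --visibility private
```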

2. Create Terraform configuration files

Create a directory called infra in your repository and add the following files.

File structure

.
├── azure-pipelines.yml
└── infra
   ├── versions.tf
   ├── providers.tf
   ├── variables.tf
   ├── main.tf
   ├── outputs.tf
   └── prod.tfvars

versions.tf

terraform {
 required_version = ">= 1.8.0, < 2.0.0"

 required_providers {
   azurerm = {
     source  = "hashicorp/azurerm"
     version = "~> 4.0"
   }
 }

 backend "azurerm" {
   use_oidc             = true
   use_azuread_auth     = true
   storage_account_name = "REPLACE_WITH_STATE_STORAGE_ACCOUNT"
   container_name       = "tfstate"
   key                  = "terraform-azure-devops/prod.terraform.tfstate"
 }
}

providers.tf

provider "azurerm" {
 features {}
}

variables.tf

variable "location" {
 description = "Azure region for all resources."
 type        = string
}

variable "resource_group_name" {
 description = "Name of the resource group."
 type        = string
}

variable "vnet_name" {
 description = "Name of the virtual network."
 type        = string
}

variable "vnet_address_space" {
 description = "Address space for the virtual network."
 type        = list(string)
}

variable "subnet_name" {
 description = "Name of the subnet."
 type        = string
}

variable "subnet_prefixes" {
 description = "Address prefixes for the subnet."
 type        = list(string)
}

variable "tags" {
 description = "Tags applied to all resources."
 type        = map(string)
 default     = {}
}

main.tf

resource "azurerm_resource_group" "this" {
 name     = var.resource_group_name
 location = var.location
 tags     = var.tags
}

resource "azurerm_virtual_network" "this" {
 name                = var.vnet_name
 location            = azurerm_resource_group.this.location
 resource_group_name = azurerm_resource_group.this.name
 address_space       = var.vnet_address_space
 tags                = var.tags
}

resource "azurerm_subnet" "this" {
 name                 = var.subnet_name
 resource_group_name  = azurerm_resource_group.this.name
 virtual_network_name = azurerm_virtual_network.this.name
 address_prefixes     = var.subnet_prefixes
}

outputs.tf

output "resource_group_name" {
 value = azurerm_resource_group.this.name
}

output "virtual_network_name" {
 value = azurerm_virtual_network.this.name
}

output "subnet_name" {
 value = azurerm_subnet.this.name
}

prod.tfvars

location            = "westeurope"
resource_group_name = "rg-terraform-azure-devops-prod"
vnet_name           = "vnet-terraform-azure-devops-prod"
vnet_address_space  = ["10.20.0.0/16"]
subnet_name         = "snet-app"
subnet_prefixes     = ["10.20.1.0/24"]

tags = {
 environment = "prod"
 managed_by  = "terraform"
 repo        = "terraform-azure-devops"
}

After your first local terraform init, commit the generated .terraform.lock.hcl file so future runs install the same provider versions by default.
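
For example, assuming Terraform is installed locally, you can generate the lock file, optionally extend it with hashes for other platforms (such as the linux_amd64 Microsoft-hosted agents), and commit it:

```shell
cd infra

# Creates .terraform.lock.hcl on first init
terraform init -input=false

# Optionally record provider hashes for additional platforms,
# e.g. Linux pipeline agents and a macOS workstation
terraform providers lock -platform=linux_amd64 -platform=darwin_arm64

git add .terraform.lock.hcl
git commit -m "Pin provider versions via dependency lock file"
```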

3. Define CI/CD steps in YAML

Create the following azure-pipelines.yml file in the root of the repository.

This pipeline validates Terraform on pull requests and pushes to main, generates a saved plan, publishes that plan as an artifact, and only applies changes after a run reaches the prod environment. The artifact step matters because Microsoft-hosted agents get a fresh machine for each job. Approval checks also live on the environment, not in YAML.

azure-pipelines.yml

trigger:
 branches:
   include:
     - main

pr:
 branches:
   include:
     - main

pool:
 vmImage: 'ubuntu-latest'

variables:
 terraformWorkingDirectory: 'infra'
 terraformVersion: '1.10.5'
 azureServiceConnection: 'sc-terraform-prod'
 environmentName: 'prod'

stages:
 - stage: Validate
   displayName: 'Validate Terraform'
   jobs:
     - job: Validate
       displayName: 'Run terraform fmt and validate'
       steps:
         - checkout: self

         - task: TerraformInstaller@1
           displayName: 'Install Terraform'
           inputs:
             terraformVersion: '$(terraformVersion)'

         - task: AzureCLI@2
           displayName: 'Terraform fmt and validate'
           inputs:
             azureSubscription: '$(azureServiceConnection)'
             addSpnToEnvironment: true
             scriptType: 'bash'
             scriptLocation: 'inlineScript'
             inlineScript: |
               set -euo pipefail

               export TF_IN_AUTOMATION=true
               export ARM_USE_OIDC=true
               export ARM_USE_AZUREAD=true
               export ARM_OIDC_TOKEN="$idToken"
               export ARM_CLIENT_ID="$servicePrincipalId"
               export ARM_TENANT_ID="$tenantId"
               export ARM_SUBSCRIPTION_ID="$(az account show --query id -o tsv)"
               export ARM_OIDC_AZURE_SERVICE_CONNECTION_ID="$AZURESUBSCRIPTION_SERVICE_CONNECTION_ID"

               cd "$(terraformWorkingDirectory)"

               terraform fmt -check -recursive
               terraform init -input=false
               terraform validate

 - stage: Plan
   displayName: 'Plan Terraform'
   dependsOn: Validate
   jobs:
     - job: Plan
       displayName: 'Create Terraform plan'
       steps:
         - checkout: self

         - task: TerraformInstaller@1
           displayName: 'Install Terraform'
           inputs:
             terraformVersion: '$(terraformVersion)'

         - task: AzureCLI@2
           displayName: 'Terraform plan'
           inputs:
             azureSubscription: '$(azureServiceConnection)'
             addSpnToEnvironment: true
             scriptType: 'bash'
             scriptLocation: 'inlineScript'
             inlineScript: |
               set -euo pipefail

               export TF_IN_AUTOMATION=true
               export ARM_USE_OIDC=true
               export ARM_USE_AZUREAD=true
               export ARM_OIDC_TOKEN="$idToken"
               export ARM_CLIENT_ID="$servicePrincipalId"
               export ARM_TENANT_ID="$tenantId"
               export ARM_SUBSCRIPTION_ID="$(az account show --query id -o tsv)"
               export ARM_OIDC_AZURE_SERVICE_CONNECTION_ID="$AZURESUBSCRIPTION_SERVICE_CONNECTION_ID"

               cd "$(terraformWorkingDirectory)"

               terraform init -input=false
               terraform plan -input=false -lock-timeout=300s -out=tfplan -var-file=prod.tfvars
               terraform show -no-color tfplan > tfplan.txt

               mkdir -p "$(Build.ArtifactStagingDirectory)/terraform-plan"
               cp tfplan "$(Build.ArtifactStagingDirectory)/terraform-plan/tfplan"
               cp tfplan.txt "$(Build.ArtifactStagingDirectory)/terraform-plan/tfplan.txt"

         - publish: '$(Build.ArtifactStagingDirectory)/terraform-plan'
           artifact: 'terraform-plan'
           displayName: 'Publish Terraform plan artifact'

 - stage: Apply
   displayName: 'Apply Terraform'
   dependsOn: Plan
   condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
   jobs:
     - deployment: Apply
       displayName: 'Apply Terraform to production'
       environment: '$(environmentName)'
       strategy:
         runOnce:
           deploy:
             steps:
               - checkout: self

               - task: DownloadPipelineArtifact@2
                 displayName: 'Download Terraform plan artifact'
                 inputs:
                   buildType: 'current'
                   artifactName: 'terraform-plan'
                   targetPath: '$(Pipeline.Workspace)/terraform-plan'

               - task: TerraformInstaller@1
                 displayName: 'Install Terraform'
                 inputs:
                   terraformVersion: '$(terraformVersion)'

               - task: AzureCLI@2
                 displayName: 'Terraform apply'
                 inputs:
                   azureSubscription: '$(azureServiceConnection)'
                   addSpnToEnvironment: true
                   scriptType: 'bash'
                   scriptLocation: 'inlineScript'
                   inlineScript: |
                     set -euo pipefail

                     export TF_IN_AUTOMATION=true
                     export ARM_USE_OIDC=true
                     export ARM_USE_AZUREAD=true
                     export ARM_OIDC_TOKEN="$idToken"
                     export ARM_CLIENT_ID="$servicePrincipalId"
                     export ARM_TENANT_ID="$tenantId"
                     export ARM_SUBSCRIPTION_ID="$(az account show --query id -o tsv)"
                     export ARM_OIDC_AZURE_SERVICE_CONNECTION_ID="$AZURESUBSCRIPTION_SERVICE_CONNECTION_ID"

                     cd "$(terraformWorkingDirectory)"

                     terraform init -input=false
                     terraform apply -input=false -lock-timeout=300s "$(Pipeline.Workspace)/terraform-plan/tfplan"
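
Before wiring this into the pipeline, it can help to dry-run the same sequence locally. A minimal sketch, assuming you are authenticated with az login and have filled in the backend placeholders in versions.tf:

```shell
cd infra

terraform init -input=false
terraform fmt -check -recursive
terraform validate

# Save a plan, review it, then apply exactly that saved plan --
# the same plan/review/apply pattern the pipeline stages follow
terraform plan -input=false -out=tfplan -var-file=prod.tfvars
terraform show -no-color tfplan
terraform apply -input=false tfplan
```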

4. Configure the Azure DevOps pipeline

After committing the files above, configure the pipeline in Azure DevOps as follows:

  1. Push the repository to Azure Repos or connect the repository from GitHub.
  2. In Azure DevOps, go to Pipelines and create a new pipeline that uses the existing azure-pipelines.yml file.
  3. Make sure the Azure Resource Manager service connection is named sc-terraform-prod, or update the YAML variable to match your service connection name.
  4. Create an environment named prod under Pipelines → Environments.
  5. Add an Approval check to the prod environment.
  6. Open a pull request to confirm the Validate and Plan stages run.
  7. Merge the pull request into main to trigger the Apply stage after approval.

Azure DevOps approvals and checks are managed on the environment or other protected resources in the web UI, not in the YAML file itself. 

5. Run the pipeline

Once the repository contains the Terraform files and the azure-pipelines.yml definition, you can run the workflow through Azure DevOps.

Start by opening a pull request against the main branch. This triggers the Validate and Plan stages of the pipeline. During this run, Azure DevOps installs Terraform, initializes the backend, validates the configuration, and generates a saved execution plan. The plan output is then published as a pipeline artifact so it can be reviewed and reused later in the deployment process.

After reviewing the proposed changes, merge the pull request into main. This triggers the Apply stage, but the deployment does not start immediately if the target Azure DevOps environment has an approval check configured. Instead, the run waits for approval before Terraform applies the saved plan.

This flow gives teams a safer delivery process: infrastructure changes are validated automatically, planned before execution, and only applied after review and approval.

When the pipeline runs, verify that:

  • terraform fmt -check and terraform validate complete successfully
  • the plan artifact is published in the Plan stage
  • the apply stage waits for approval on the prod environment
  • Terraform applies the same saved plan that was generated earlier


Best practices for Terraform in Azure DevOps pipelines

Here are ten best practices you’ll want to think about when writing your Terraform pipelines in Azure DevOps:

  1. Define your Azure DevOps pipelines using YAML as code (e.g., azure-pipelines.yml) alongside your Terraform configurations. This practice enables versioning, review, and audit of pipeline changes. Store your Terraform configurations in version control (e.g., Git) to track changes, collaborate with team members, and maintain a history of infrastructure modifications.
  2. Configure pipeline triggers to automatically execute pipelines upon code changes (e.g., commits to specific branches or pull requests, or on an overnight schedule to run a plan). This ensures timely validation and deployment of infrastructure changes.
  3. Optimize pipeline performance by parallelizing tasks whenever possible. For example, run Terraform plan and apply stages concurrently for different environments or resource groups.
  4. Maintain separate pipelines and environments (e.g., Dev, QA, Prod) to isolate infrastructure changes and avoid unintended impacts across environments. These can be set up in the Environments section of Azure DevOps.
  5. Integrate automated testing into your pipelines for Terraform configurations using tools like Terratest to validate infrastructure changes before deployment.
  6. Implement robust error handling and logging mechanisms within pipelines to capture and report errors effectively. Utilize Azure DevOps pipeline features like logging commands and error-handling tasks.
  7. Include validation steps in your pipelines to verify Terraform configurations against best practices, policies, and compliance requirements. Use terraform validate to confirm the syntax is correct, terraform fmt to check formatting, and static analysis tools for deeper checks.
  8. Monitor pipeline execution, performance metrics, and resource usage to identify bottlenecks, optimize workflows, and ensure the reliability of infrastructure deployments.
  9. Apply security best practices to your pipelines, including access control, role-based permissions, and regular security reviews to mitigate risks associated with pipeline configurations and execution.
  10. Document pipeline configurations, deployment processes, and infrastructure designs to facilitate collaboration, knowledge sharing, and onboarding of team members.
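
The environment-separation practice above (item 4) is often implemented with one variable file and one state key per environment. A sketch, where dev.tfvars and the state key names are illustrative assumptions rather than files from this tutorial:

```shell
# Hypothetical per-environment runs: same configuration,
# but a different backend state key and variable file each time.
# -backend-config overrides the key set in the backend block.
terraform init -input=false -reconfigure \
  -backend-config="key=terraform-azure-devops/dev.terraform.tfstate"
terraform plan -input=false -out=tfplan-dev -var-file=dev.tfvars

terraform init -input=false -reconfigure \
  -backend-config="key=terraform-azure-devops/prod.terraform.tfstate"
terraform plan -input=false -out=tfplan-prod -var-file=prod.tfvars
```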

Deploying Terraform resources with Spacelift

Terraform is really powerful, but to achieve an end-to-end secure GitOps approach, you need to use a product that can run your Terraform workflows. Spacelift takes managing Terraform to the next level by giving you access to a powerful CI/CD workflow and unlocking features such as:

  • Policies (based on Open Policy Agent) – You can control how many approvals you need for runs, what kind of resources you can create, and what kind of parameters these resources can have, and you can also control the behavior when a pull request is open or merged.
  • Multi-IaC workflows – Combine Terraform with Kubernetes, Ansible, and other infrastructure-as-code (IaC) tools such as OpenTofu, Pulumi, and CloudFormation; create dependencies among them; and share outputs.
  • Build self-service infrastructure – You can use Blueprints to build self-service infrastructure; simply complete a form to provision infrastructure based on Terraform and other supported tools.
  • Integrations with any third-party tools – You can integrate with your favorite third-party tools and even build policies for them. For example, see how to integrate security tools in your workflows using Custom Inputs.

Spacelift enables you to create private workers inside your infrastructure, which helps you execute Spacelift-related workflows on your end. Read the documentation for more information on configuring private workers.

You can try it for free by creating a trial account or booking a demo with one of our engineers.

Key points

Azure DevOps CI/CD pipelines can automate tasks across the Terraform workflow, including deployments. Azure DevOps is a particularly good choice when seamless integration with Azure and the wider Azure ecosystem matters. Running workflow tasks through pipelines keeps them repeatable and controlled, ensuring consistency and reducing manual effort.

Achieve Terraform at scale with Spacelift

Spacelift takes managing infrastructure at scale to a whole new level, offering a more open, more customizable, and more extensible product. It’s a better, more flexible CI/CD for Terraform, offering maximum security without sacrificing functionality.

Learn more

Frequently asked questions

  • Can Terraform be used with Azure?

    Yes, Terraform can be used with Azure by using the azurerm provider. It allows you to define and manage Azure resources like VMs, networks, and storage accounts using infrastructure as code. Terraform also supports remote state in Azure Blob Storage and integrates with Azure AD for authentication.

  • How do you run Terraform in Azure DevOps?

    The usual setup is to store your Terraform code in an Azure DevOps repository, then use a pipeline to run terraform init, terraform validate, terraform plan, and terraform apply. In most teams, plan runs automatically for pull requests, while apply is restricted to approved changes.

  • How do you authenticate Terraform in Azure DevOps securely?

    Azure DevOps needs an Azure connection with permissions to create and update resources. For security, teams should avoid hardcoded credentials and use a more secure authentication method that reduces reliance on long-lived secrets wherever possible.

  • How do you store Terraform state in Azure Blob?

    Terraform state should be stored remotely rather than on the pipeline agent. For Azure environments, Azure Blob Storage is a common choice because it allows teams to centralize state, reduce the risk of drift, and support collaboration across multiple pipeline runs.

  • What are the most common issues when using Terraform with Azure DevOps?

    Common problems include misconfigured authentication, missing permissions, backend state issues, inconsistent Terraform versions, and pipeline stages that do not properly pass plan artifacts between jobs. Most of these issues can be reduced by standardizing versions, using remote state, and making the plan/apply workflow explicit.