AWS · CodePipeline · CI/CD · DevOps · CodeBuild · ECS

AWS CodePipeline Tutorial: Complete CI/CD Guide for Cloud Engineers (2026)

April 12, 2026 · 20 min read


You've heard of GitHub Actions for CI/CD. But when you work at a company that's all-in on AWS, you'll encounter AWS CodePipeline — the native CI/CD service that integrates deeply with every other AWS service. Understanding CodePipeline is essential for AWS certifications (SAA-C03, DevOps Professional) and for working on AWS-native engineering teams.

This guide teaches you AWS CodePipeline, CodeBuild, and CodeDeploy from scratch — with a complete working example that builds and deploys a containerized application to ECS Fargate.

The AWS DevOps Toolchain

AWS has a suite of CI/CD services that work together:

Service        Role                       Equivalent
CodeCommit     Source code repository     GitHub / GitLab
CodeBuild      Build and test runner      GitHub Actions jobs / Jenkins
CodeDeploy     Deployment automation      Spinnaker / Argo Rollouts
CodePipeline   Orchestrator               GitHub Actions workflows
CodeArtifact   Package registry           npm registry / PyPI
CodeStar       Full project setup         (no direct equivalent)

In practice, most teams use GitHub or GitLab for source (AWS closed CodeCommit and CodeStar to new customers in 2024) and CodePipeline + CodeBuild + CodeDeploy for the CI/CD pipeline. The three Code services integrate natively with GitHub through a CodeStar connection.

CodePipeline Concepts

A Pipeline has stages, and each stage has actions.

Source Stage → Build Stage → Test Stage → Deploy Stage
    ↓               ↓            ↓             ↓
  CodeCommit     CodeBuild    CodeBuild       CodeDeploy
  (or GitHub)   (build+test) (integration)   (to ECS)

Artifacts: The output of each stage is stored in an S3 bucket and passed to the next stage. If CodeBuild creates a Docker image, it outputs the image URI (not the image itself) for the deploy stage to use.

Approval Actions: You can insert a manual approval step between staging and production. An engineer reviews and clicks "Approve" in the console before production deployment proceeds.

Your First Pipeline: Building a Docker App

Let's build a complete pipeline that:

  1. Triggers on every push to the main branch in GitHub
  2. Runs CodeBuild to build and push a Docker image to ECR
  3. Deploys the new image to ECS Fargate with zero downtime

Step 1: ECR Repository

First, create an Elastic Container Registry to store your Docker images:

aws ecr create-repository \
  --repository-name my-app \
  --region us-east-1

# Output: arn:aws:ecr:us-east-1:123456789:repository/my-app
# Image URI: 123456789.dkr.ecr.us-east-1.amazonaws.com/my-app
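That image URI follows ECR's fixed naming convention, which is worth knowing when you wire up the buildspec's ECR_REPOSITORY_URI variable by hand. A quick sketch, with a placeholder account ID:

```shell
# Assemble an ECR image URI from its parts (account ID is a placeholder)
ACCOUNT_ID=123456789012
REGION=us-east-1
REPO_NAME=my-app
ECR_REPOSITORY_URI="${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/${REPO_NAME}"
echo "$ECR_REPOSITORY_URI"
# → 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app
```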

Step 2: The CodeBuild buildspec.yml

The buildspec.yml file in your repository root tells CodeBuild what to do:

# buildspec.yml — place in repository root
version: 0.2

env:
  variables:
    AWS_DEFAULT_REGION: us-east-1
    ECR_REPOSITORY_URI: 123456789.dkr.ecr.us-east-1.amazonaws.com/my-app

phases:
  pre_build:
    commands:
      - echo "Logging in to Amazon ECR..."
      - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin $ECR_REPOSITORY_URI
      - COMMIT_HASH=$(echo $CODEBUILD_RESOLVED_SOURCE_VERSION | cut -c 1-7)
      - IMAGE_TAG=${COMMIT_HASH:=latest}
      - echo "Building image $ECR_REPOSITORY_URI:$IMAGE_TAG"

  build:
    commands:
      - echo "Running tests..."
      - npm test
      - echo "Building Docker image..."
      - docker build -t $ECR_REPOSITORY_URI:latest .
      - docker tag $ECR_REPOSITORY_URI:latest $ECR_REPOSITORY_URI:$IMAGE_TAG

  post_build:
    commands:
      - echo "Pushing Docker image to ECR..."
      - docker push $ECR_REPOSITORY_URI:latest
      - docker push $ECR_REPOSITORY_URI:$IMAGE_TAG
      - echo "Writing image definitions file..."
      - printf '[{"name":"my-app","imageUri":"%s"}]' $ECR_REPOSITORY_URI:$IMAGE_TAG > imagedefinitions.json

artifacts:
  files:
    - imagedefinitions.json

cache:
  paths:
    - '/root/.npm/**/*'
    - 'node_modules/**/*'

Key output: The imagedefinitions.json file tells the ECS deploy action which image to deploy:

[{"name": "my-app", "imageUri": "123456789.dkr.ecr.us-east-1.amazonaws.com/my-app:a1b2c3d"}]
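You can dry-run the two buildspec steps that produce this artifact (the 7-character tag and the printf) locally, without CodeBuild; the commit SHA below is illustrative:

```shell
# Simulate the buildspec's tag derivation and artifact output locally
ECR_REPOSITORY_URI="123456789.dkr.ecr.us-east-1.amazonaws.com/my-app"
CODEBUILD_RESOLVED_SOURCE_VERSION="a1b2c3d4e5f6a7b8"   # stand-in commit SHA
COMMIT_HASH=$(echo $CODEBUILD_RESOLVED_SOURCE_VERSION | cut -c 1-7)
IMAGE_TAG=${COMMIT_HASH:=latest}   # falls back to "latest" if the SHA is empty
printf '[{"name":"my-app","imageUri":"%s"}]' "$ECR_REPOSITORY_URI:$IMAGE_TAG" > imagedefinitions.json
cat imagedefinitions.json
# → [{"name":"my-app","imageUri":"123456789.dkr.ecr.us-east-1.amazonaws.com/my-app:a1b2c3d"}]
```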

Step 3: Create the Pipeline with Terraform

# terraform/pipeline/main.tf

# Look up the current account ID for a globally unique bucket name
data "aws_caller_identity" "current" {}

# S3 bucket for pipeline artifacts
resource "aws_s3_bucket" "pipeline_artifacts" {
  bucket = "my-app-pipeline-artifacts-${data.aws_caller_identity.current.account_id}"
}

resource "aws_s3_bucket_versioning" "pipeline_artifacts" {
  bucket = aws_s3_bucket.pipeline_artifacts.id
  versioning_configuration {
    status = "Enabled"
  }
}

# IAM role for CodePipeline
resource "aws_iam_role" "codepipeline" {
  name = "my-app-codepipeline-role"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action    = "sts:AssumeRole"
      Effect    = "Allow"
      Principal = { Service = "codepipeline.amazonaws.com" }
    }]
  })
}

resource "aws_iam_role_policy" "codepipeline" {
  name = "my-app-codepipeline-policy"
  role = aws_iam_role.codepipeline.id

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Action = ["s3:GetObject", "s3:GetObjectVersion", "s3:PutObject", "s3:GetBucketVersioning"]
        Resource = [aws_s3_bucket.pipeline_artifacts.arn, "${aws_s3_bucket.pipeline_artifacts.arn}/*"]
      },
      {
        Effect   = "Allow"
        Action   = ["codebuild:BatchGetBuilds", "codebuild:StartBuild"]
        Resource = aws_codebuild_project.my_app.arn
      },
      {
        Effect   = "Allow"
        Action   = ["ecs:*", "iam:PassRole"]
        Resource = "*"
      },
      {
        Effect   = "Allow"
        Action   = ["codestar-connections:UseConnection"]
        Resource = aws_codestarconnections_connection.github.arn
      }
    ]
  })
}

# GitHub connection (one-time manual approval in console)
resource "aws_codestarconnections_connection" "github" {
  name          = "github-connection"
  provider_type = "GitHub"
}

# IAM role for CodeBuild
resource "aws_iam_role" "codebuild" {
  name = "my-app-codebuild-role"
  
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action    = "sts:AssumeRole"
      Effect    = "Allow"
      Principal = { Service = "codebuild.amazonaws.com" }
    }]
  })
}

resource "aws_iam_role_policy_attachment" "codebuild_ecr" {
  role       = aws_iam_role.codebuild.name
  policy_arn = "arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser"
}

resource "aws_iam_role_policy_attachment" "codebuild_s3" {
  role       = aws_iam_role.codebuild.name
  policy_arn = "arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess"
}

# CodeBuild project
resource "aws_codebuild_project" "my_app" {
  name          = "my-app-build"
  service_role  = aws_iam_role.codebuild.arn
  build_timeout = 20

  artifacts {
    type = "CODEPIPELINE"
  }

  environment {
    compute_type                = "BUILD_GENERAL1_SMALL"
    image                       = "aws/codebuild/standard:7.0"
    type                        = "LINUX_CONTAINER"
    image_pull_credentials_type = "CODEBUILD"
    privileged_mode             = true   # Required for Docker builds

    environment_variable {
      name  = "ECR_REPOSITORY_URI"
      # Assumes the ECR repo from Step 1 is also managed in Terraform as
      # aws_ecr_repository.my_app; otherwise hard-code the repository URL
      value = aws_ecr_repository.my_app.repository_url
    }
  }

  source {
    type      = "CODEPIPELINE"
    buildspec = "buildspec.yml"
  }

  cache {
    type  = "LOCAL"
    modes = ["LOCAL_DOCKER_LAYER_CACHE", "LOCAL_SOURCE_CACHE"]
  }

  logs_config {
    cloudwatch_logs {
      group_name  = "/codebuild/my-app"
      stream_name = "build"
    }
  }
}

# The Pipeline itself
resource "aws_codepipeline" "my_app" {
  name     = "my-app-pipeline"
  role_arn = aws_iam_role.codepipeline.arn

  artifact_store {
    location = aws_s3_bucket.pipeline_artifacts.bucket
    type     = "S3"
  }

  # Stage 1: Source — pull from GitHub on every push to main
  stage {
    name = "Source"
    action {
      name             = "Source"
      category         = "Source"
      owner            = "AWS"
      provider         = "CodeStarSourceConnection"
      version          = "1"
      output_artifacts = ["source_output"]

      configuration = {
        ConnectionArn        = aws_codestarconnections_connection.github.arn
        FullRepositoryId     = "your-org/my-app"
        BranchName           = "main"
        DetectChanges        = true
      }
    }
  }

  # Stage 2: Build — CodeBuild builds Docker image and pushes to ECR
  stage {
    name = "Build"
    action {
      name             = "Build"
      category         = "Build"
      owner            = "AWS"
      provider         = "CodeBuild"
      version          = "1"
      input_artifacts  = ["source_output"]
      output_artifacts = ["build_output"]

      configuration = {
        ProjectName = aws_codebuild_project.my_app.name
      }
    }
  }

  # Stage 3: Deploy to Staging ECS cluster
  stage {
    name = "DeployStaging"
    action {
      name            = "DeployStaging"
      category        = "Deploy"
      owner           = "AWS"
      provider        = "ECS"
      version         = "1"
      input_artifacts = ["build_output"]

      configuration = {
        ClusterName = "staging-cluster"
        ServiceName = "my-app-staging"
        FileName    = "imagedefinitions.json"
      }
    }
  }

  # Stage 4: Manual approval before production
  stage {
    name = "Approve"
    action {
      name     = "Approve"
      category = "Approval"
      owner    = "AWS"
      provider = "Manual"
      version  = "1"

      configuration = {
        CustomData      = "Review staging deployment before approving production release."
        ExternalEntityLink = "https://staging.myapp.com/health"
      }
    }
  }

  # Stage 5: Deploy to Production ECS cluster
  stage {
    name = "DeployProduction"
    action {
      name            = "DeployProduction"
      category        = "Deploy"
      owner           = "AWS"
      provider        = "ECS"
      version         = "1"
      input_artifacts = ["build_output"]

      configuration = {
        ClusterName = "production-cluster"
        ServiceName = "my-app-production"
        FileName    = "imagedefinitions.json"
      }
    }
  }
}

Step 4: One-Time GitHub Connection Setup

Terraform creates the GitHub connection in a Pending state. You must approve it once in the AWS Console:

  1. Go to Developer Tools → Settings → Connections
  2. Find your connection (status: Pending)
  3. Click Update pending connection
  4. Authorize with GitHub
  5. Status changes to Available

After this, every push to main automatically triggers the pipeline.

Viewing Pipeline Status

# Get pipeline execution history
aws codepipeline list-pipeline-executions \
  --pipeline-name my-app-pipeline \
  --max-results 5

# Get detailed status of latest run
aws codepipeline get-pipeline-state \
  --name my-app-pipeline

# Watch CodeBuild logs in real time
aws logs tail /codebuild/my-app --follow
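get-pipeline-state returns a JSON document with a stageStates array. The trimmed sample below is illustrative but matches the real field names, and shows one way to pull out per-stage status; in day-to-day use, the CLI's --query flag is cleaner:

```shell
# Trimmed, illustrative sample of get-pipeline-state output
cat > pipeline-state.json <<'EOF'
{"pipelineName":"my-app-pipeline","stageStates":[
 {"stageName":"Source","latestExecution":{"status":"Succeeded"}},
 {"stageName":"Build","latestExecution":{"status":"InProgress"}}]}
EOF
# Extract stage names and statuses without jq
grep -Eo '"stageName":"[^"]*"|"status":"[^"]*"' pipeline-state.json
# → "stageName":"Source"
#   "status":"Succeeded"
#   "stageName":"Build"
#   "status":"InProgress"
```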

Blue/Green Deployments with CodeDeploy

For zero-downtime blue/green deployments on ECS:

# Use CodeDeploy instead of direct ECS deployments
resource "aws_codedeploy_app" "my_app" {
  name             = "my-app"
  compute_platform = "ECS"
}

resource "aws_codedeploy_deployment_group" "my_app" {
  app_name               = aws_codedeploy_app.my_app.name
  deployment_group_name  = "my-app-production"
  # Shifts traffic 10% per minute; use ECSAllAtOnce for an immediate cutover
  deployment_config_name = "CodeDeployDefault.ECSLinear10PercentEvery1Minutes"
  service_role_arn       = aws_iam_role.codedeploy.arn

  auto_rollback_configuration {
    enabled = true
    events  = ["DEPLOYMENT_FAILURE", "DEPLOYMENT_STOP_ON_ALARM"]
  }

  deployment_style {
    deployment_option = "WITH_TRAFFIC_CONTROL"
    deployment_type   = "BLUE_GREEN"
  }

  blue_green_deployment_config {
    deployment_ready_option {
      action_on_timeout = "CONTINUE_DEPLOYMENT"
    }
    terminate_blue_instances_on_deployment_success {
      action                           = "TERMINATE"
      termination_wait_time_in_minutes = 5
    }
  }

  ecs_service {
    cluster_name = aws_ecs_cluster.production.name
    service_name = aws_ecs_service.my_app.name
  }

  load_balancer_info {
    target_group_pair_info {
      prod_traffic_route {
        listener_arns = [aws_lb_listener.https.arn]
      }
      target_group {
        name = aws_lb_target_group.blue.name
      }
      target_group {
        name = aws_lb_target_group.green.name
      }
    }
  }
}
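One prerequisite the snippet above assumes but does not show: the ECS service has to delegate deployments to CodeDeploy through its deployment controller, or the deployment group cannot attach to it. A sketch, with the rest of the service configuration elided:

```terraform
resource "aws_ecs_service" "my_app" {
  name    = "my-app-production"
  cluster = aws_ecs_cluster.production.id
  # ...task_definition, desired_count, network_configuration...

  # Required for blue/green: ECS stops running its own rolling deployments
  deployment_controller {
    type = "CODE_DEPLOY"
  }
}
```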

With CodeDeploy blue/green:

  1. New version deploys to the "green" target group
  2. CodeDeploy runs health checks on green
  3. Traffic shifts from blue to green at the rate set by deployment_config_name (all at once, linear, or canary)
  4. If health checks fail, automatic rollback to blue
  5. Old "blue" containers terminate after the wait period
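Note that switching the pipeline's deploy action from the ECS provider to CodeDeployToECS also changes the artifact it expects: instead of imagedefinitions.json, CodeDeploy reads an appspec.yaml (plus a task definition template). A minimal sketch; the container name and port are assumptions carried over from the earlier examples:

```yaml
# appspec.yaml, read by the CodeDeployToECS pipeline action
version: 0.0
Resources:
  - TargetService:
      Type: AWS::ECS::Service
      Properties:
        TaskDefinition: <TASK_DEFINITION>   # placeholder filled in at deploy time
        LoadBalancerInfo:
          ContainerName: "my-app"   # must match the container in the task definition
          ContainerPort: 8080       # assumed app port
```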

CodePipeline vs GitHub Actions: When to Use What

Factor               AWS CodePipeline                        GitHub Actions
AWS integration      Native, no credentials needed           Requires OIDC or secrets
Setup                More complex Terraform                  Simple YAML
Cost                 $1/pipeline/month + CodeBuild minutes   Free tier, then per-minute
Secrets management   AWS Secrets Manager native              GitHub Secrets
Visibility           AWS Console, CloudWatch                 GitHub UI
Best for             Enterprise AWS-only teams               Open source, GitHub-native teams

Recommendation: Use GitHub Actions when your team is GitHub-native and wants simplicity. Use CodePipeline when you're in an enterprise AWS environment where security teams require all services to run inside your AWS account.

Common Mistakes and How to Avoid Them

Mistake 1: No artifact caching. CodeBuild downloads all npm/pip packages on every run. Enable caching:

cache:
  paths:
    - '/root/.npm/**/*'

Mistake 2: Not setting build timeouts. The default timeout is 60 minutes, and a stuck build wastes money. Set build_timeout = 20 in Terraform.

Mistake 3: No manual approval for production. Never auto-deploy to production. Always include a manual approval stage.

Mistake 4: No ECR image cleanup. Without a lifecycle policy, ECR fills up with thousands of old images. Add:

resource "aws_ecr_lifecycle_policy" "my_app" {
  repository = aws_ecr_repository.my_app.name
  policy = jsonencode({
    rules = [{
      rulePriority = 1
      description  = "Keep last 30 images"
      selection    = { tagStatus = "any", countType = "imageCountMoreThan", countNumber = 30 }
      action       = { type = "expire" }
    }]
  })
}

CloudPath Academy DevOps Path

CloudPath Academy's Phase 3 covers the full DevOps engineering stack including:

  • Git workflows and branching strategies
  • CI/CD with GitHub Actions and AWS CodePipeline
  • Docker and container orchestration (ECS + Kubernetes)
  • Infrastructure as Code with Terraform
  • Blue/green and canary deployments
  • Security scanning in CI/CD (SAST, container scanning)
  • Production monitoring and incident response

Students who complete Phase 3 earn the CloudPath DevOps Engineer certificate — validated by completing real pipelines, not just reading about them.


*Ready to build production CI/CD pipelines? Start CloudPath Academy's DevOps Engineer phase today.*
