Creating a Pipeline for ECS (Fargate) using blue-green deployment on AWS

Hello everyone, in this post I will show you the pipeline that I use for my ECS clusters. It will be a blue-green deployment since it runs against production.

Here’s a quick diagram you can follow along with.

So basically this is what’s going to happen: whenever a user pushes code to the repository (that could be anywhere, but I used GitLab), it gets mirrored to AWS CodeCommit. From there the code goes to CodeBuild, which builds our Docker image and pushes it to our ECR. CodePipeline constantly watches that ECR repository for changes, and whenever it detects a new version of our Docker image it hands it off to CodeDeploy which, as you can guess, rolls the image out to the production Fargate node.

I didn’t create steps for unit tests or integration tests since I wanted to keep things easy to follow; it will be a basic deployment setup.

First things first, you will need Administrator access for your IAM user, or at least full permissions for the services used here. You should have an ECR repository and a ready-to-use CodeCommit repository, a working Fargate cluster with a service, and a load balancer with two target groups. If you don’t know how to set these up you can follow these links (there’s also a quick CLI sketch right after them):

https://docs.aws.amazon.com/AmazonECR/latest/userguide/repository-create.html
https://docs.aws.amazon.com/codecommit/latest/userguide/setting-up.html#setting-up-other
https://docs.aws.amazon.com/eks/latest/userguide/fargate.html
https://docs.aws.amazon.com/AmazonECS/latest/developerguide/load-balancer-types.html
https://aws.amazon.com/tr/premiumsupport/knowledge-center/create-alb-auto-register/
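
If you prefer to handle those prerequisites from the CLI, here’s a minimal sketch, assuming the AWS CLI is configured and using placeholder names (repo-name, my-cluster, my-service):

# Create the ECR repository that will hold our Docker images
aws ecr create-repository --repository-name repo-name --region us-east-1

# Confirm the Fargate cluster and its service are active
aws ecs describe-clusters --clusters my-cluster --region us-east-1
aws ecs describe-services --cluster my-cluster --services my-service --region us-east-1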

Stage 1 – Pushing to CodeCommit

You don’t have to use CodeCommit for this pipeline; you can point the CodeBuild stage at your own code repository. I just wanted to see whether that would work or not. Since GitLab repository mirroring didn’t work on my end, I created a basic .gitlab-ci.yml file to push our code to CodeCommit. Here’s the file:

image: docker
stages:
  - deploy

deploy to production:
  stage: deploy
  when: manual   # the job has to be triggered by hand from the GitLab UI
  script:
    - apk update && apk add git
    # Clone the source repository and force-push it to CodeCommit
    - git clone $SOURCE_REPO Github-to-ECS
    - cd Github-to-ECS
    - git branch -a
    - git pull
    - git push --force $SCHEME://$USERNAME:$SECRET@$REPO_URL

For this stage to work, you should add the following environment variables to your GitLab project. Go to Settings > CI/CD > Variables and add these:

REPO_URL #Your repository address on CodeCommit, without the scheme (the SCHEME variable supplies it)

SCHEME: https #You can use either HTTP or HTTPS

SECRET #The secret (Git credentials password) for your CodeCommit user; if you don’t know how to create one you can follow this link: https://docs.aws.amazon.com/codecommit/latest/userguide/setting-up-gc.html

SOURCE_REPO: https://gitlab.com/enestos/github-to-ecs.git #The source repository that will be mirrored to CodeCommit

USERNAME #The Git credentials username you created for CodeCommit
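
In case you’d rather create the CodeCommit repository and the HTTPS Git credentials from the CLI, a rough sketch would be (the user and repository names are placeholders):

# Create the CodeCommit repository that GitLab will push to
aws codecommit create-repository --repository-name Github-to-ECS --region us-east-1

# Generate HTTPS Git credentials for an existing IAM user;
# the returned ServiceUserName and ServicePassword map to USERNAME and SECRET above
aws iam create-service-specific-credential \
    --user-name my-codecommit-user \
    --service-name codecommit.amazonaws.com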

Don’t forget that I added the manual switch, so the job has to be triggered manually by you. If you don’t want that, you can simply delete the when: manual field. Another thing to note here: you can add more git commands to push from different branches.

Now that our code is ready to be built by CodeBuild, it’s time to show how that’s done.


Stage 2 – Pushing to CodeBuild

Go to CodeBuild > Create build project and type in your project name. Pick a source; it will be CodeCommit in our case. Pick your repository and select your environment; in this step, don’t forget to check the Privileged box since we need privileged mode to build our Docker image. You can create a new service role, which is highly recommended. Select Use a buildspec file for our build purposes, and then click Create build project.
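
If you prefer the CLI, a rough equivalent of those console steps might look like this (project name, repository location, image, and role ARN are placeholders; privilegedMode matches the Privileged checkbox):

aws codebuild create-project \
    --name ecs-demo-build \
    --source type=CODECOMMIT,location=https://git-codecommit.us-east-1.amazonaws.com/v1/repos/Github-to-ECS \
    --artifacts type=NO_ARTIFACTS \
    --environment type=LINUX_CONTAINER,image=aws/codebuild/standard:5.0,computeType=BUILD_GENERAL1_SMALL,privilegedMode=true \
    --service-role arn:aws:iam::ACCOUNT-ID:role/codebuild-service-role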

In this step CodeBuild will look for a buildspec.yml file to create our Docker image. This will be our buildspec.yml file:

version: 0.2

phases:
  pre_build:
    commands:
      - echo "Logging in to Amazon ECR..."
      - chmod 666 /var/run/docker.sock
      - apt-get update -y
      #- apt-get install docker-ce docker-ce-cli containerd.io -y
      - apt-get install awscli -y 
      - apt-get upgrade awscli -y
      # - aws ecr get-login-password
      - aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin your-ECR-URI
  
  build:
    commands:
      - echo "Build started on `date`"
      - echo "Building the Docker image..."
      - docker build -t repo-name:latest .  
      # Tag the local image with the full ECR repository URI so the push below can find it
      - docker tag repo-name:latest your-ECR-URI/repo-name:latest
      
  post_build:
    commands:
      - echo "Build completed on `date`"
      - echo "Pushing the Docker image..."
      - docker push your-ECR-URI/repo-name:latest
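
Once the project exists you can kick off a build by hand and check that the image actually landed in ECR; a quick sketch with the same placeholder names as above:

# Start a build manually instead of waiting for a push
aws codebuild start-build --project-name ecs-demo-build

# List the image tags currently stored in the ECR repository
aws ecr describe-images --repository-name repo-name \
    --query 'imageDetails[].imageTags' --region us-east-1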

Stage 3 – Creating the CodeDeploy application

Go to CodeDeploy > Applications and click on Create application. Type in your application’s name and choose a compute platform, AWS ECS in our case. Then click on Create deployment group and type in the deployment group name. Enter the name of a service role that has the required CodeDeploy permissions. Choose your ECS cluster and service, then your load balancer, listener port, and target groups. Don’t forget that the load balancer has to be in the same VPC as the ECS cluster, and depending on your AWS security configuration they may also need to share a security group. You can use various deployment settings here; I like the Reroute traffic immediately option, but that’s totally up to you.
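
For reference, a rough CLI equivalent of those console steps might look like the sketch below. Every name, ARN, and target group in it is a placeholder, and actionOnTimeout: CONTINUE_DEPLOYMENT corresponds to the Reroute traffic immediately option:

# Create the CodeDeploy application for ECS
aws deploy create-application --application-name ecs-demo-app --compute-platform ECS

# Describe the blue/green deployment group in a JSON file, then create it from that file
cat > deployment-group.json <<'EOF'
{
  "applicationName": "ecs-demo-app",
  "deploymentGroupName": "ecs-demo-dg",
  "serviceRoleArn": "arn:aws:iam::ACCOUNT-ID:role/codedeploy-ecs-role",
  "deploymentConfigName": "CodeDeployDefault.ECSAllAtOnce",
  "deploymentStyle": {
    "deploymentType": "BLUE_GREEN",
    "deploymentOption": "WITH_TRAFFIC_CONTROL"
  },
  "blueGreenDeploymentConfiguration": {
    "deploymentReadyOption": { "actionOnTimeout": "CONTINUE_DEPLOYMENT" },
    "terminateBlueInstancesOnDeploymentSuccess": {
      "action": "TERMINATE",
      "terminationWaitTimeInMinutes": 5
    }
  },
  "ecsServices": [ { "clusterName": "my-cluster", "serviceName": "my-service" } ],
  "loadBalancerInfo": {
    "targetGroupPairInfoList": [
      {
        "targetGroups": [ { "name": "tg-blue" }, { "name": "tg-green" } ],
        "prodTrafficRoute": { "listenerArns": [ "arn:aws:elasticloadbalancing:us-east-1:ACCOUNT-ID:listener/app/my-alb/LB-ID/LISTENER-ID" ] }
      }
    ]
  }
}
EOF

aws deploy create-deployment-group --cli-input-json file://deployment-group.json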

After creating our Docker image and pushing it to our ECR, it’s time to create a CodePipeline that will check for new versions of our image; whenever there is a change, it will be pushed to CodeDeploy.


Stage 4 – Creating the CodePipeline

Go to CodePipeline and click on the Create pipeline button. Type in your pipeline’s name and create a service role, or reuse one you created earlier. Leave the advanced settings at their defaults. For the source stage, pick ECR and type in your repository name with the tag given in the CodeBuild stage. Skip the build stage since we already built our Docker image. For the deploy stage, pick the ECS blue/green deployment and then type in the CodeDeploy application name and deployment group. Leave the rest at the defaults and click Create pipeline.
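
Once the pipeline exists, you can sanity-check and trigger it from the CLI as well; a small sketch (the pipeline name is a placeholder):

# Trigger a run manually instead of waiting for a new image in ECR
aws codepipeline start-pipeline-execution --name ecs-demo-pipeline

# Watch the state of each stage (Source and Deploy in this setup)
aws codepipeline get-pipeline-state --name ecs-demo-pipeline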

Here is the taskdef.json file for the CodePipeline stage:

{
    "executionRoleArn": "arn:aws:iam:IAM-USER-ID:role/ecsTaskExecutionRole",
    "containerDefinitions": [
        {
            "name": "sample-website",
            "image": "5your-ECR-URI/repo-name:latest",
            "essential": true,
            "portMappings": [
                {
                    "hostPort": 80,
                    "protocol": "tcp",
                    "containerPort": 80
                }
            ]
        }
    ],
    "requiresCompatibilities": [
        "FARGATE"
    ],
    "networkMode": "awsvpc",
    "cpu": "256",
    "memory": "512",
    "family": "ecs-demo"
}
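
If you haven’t registered this task definition yet, one way to do it from the CLI (assuming the file above is saved as taskdef.json) is:

# Register the task definition so the appspec file below can reference it
aws ecs register-task-definition --cli-input-json file://taskdef.json --region us-east-1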

Here is the appspec.yaml file:

version: 0.0
Resources:
  - TargetService:
      Type: AWS::ECS::Service
      Properties:
        TaskDefinition: <TASK_DEFINITION>
        LoadBalancerInfo:
          ContainerName: "sample-website"
          ContainerPort: 80
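
If you want to watch the blue/green switch without going through the pipeline, you can also start a deployment straight from the CLI. This is only a minimal sketch: the application and deployment group names come from Stage 3, and the <TASK_DEFINITION> placeholder has to be replaced with the ARN of an actual registered task definition revision:

# Wrap the appspec content in a revision and hand it to CodeDeploy
cat > create-deployment.json <<'EOF'
{
  "applicationName": "ecs-demo-app",
  "deploymentGroupName": "ecs-demo-dg",
  "revision": {
    "revisionType": "AppSpecContent",
    "appSpecContent": {
      "content": "version: 0.0\nResources:\n  - TargetService:\n      Type: AWS::ECS::Service\n      Properties:\n        TaskDefinition: arn:aws:ecs:us-east-1:ACCOUNT-ID:task-definition/ecs-demo:1\n        LoadBalancerInfo:\n          ContainerName: sample-website\n          ContainerPort: 80\n"
    }
  }
}
EOF

aws deploy create-deployment --cli-input-json file://create-deployment.json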

Thank you for taking the time to read my post. If you have further questions, don’t hesitate to comment here. Until next time!

References:

https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file-example.html#appspec-file-example-ecs
https://catalog.us-east-1.prod.workshops.aws/v2/workshops/869f7eee-d3a2-490b-bf9a-ac90a8fb2d36/en-US/1-introduction/workshop
https://docs.aws.amazon.com/codebuild/latest/userguide/sample-docker.html
https://cross-account-cicd-pipeline.workshop.aws/repository-stack/push-code.html
