Introduction

Terraform is an infrastructure as code tool that lets you define both cloud and on-prem resources in human-readable configuration files that you can version, reuse, and share. You can then use a consistent workflow to provision and manage all of your infrastructure throughout its lifecycle. Terraform can manage low-level components like compute, storage, and networking resources, as well as high-level components like DNS entries and SaaS features.

Okay, but what does all this mean? Say you need to create a single resource in a given cloud provider: you can log in to its management console and create it via the web app. Pretty simple and straightforward, right? But now suppose you need to create multiple resources across different regions at enterprise scale; suddenly it's complicated. This is where Terraform comes to the rescue: with Terraform you can automate your cloud infrastructure as code and deploy a ready-to-use cloud environment within minutes. Now it's pretty straightforward again!

The core Terraform workflow consists of three stages, plus a cleanup command:

  • Init: In this stage Terraform initializes a working directory containing Terraform configuration files and downloads the plugins and dependencies required by the given cloud providers. The command is:
  • terraform init
    
  • Plan: At this stage Terraform creates an execution plan and displays all the changes it is going to make. The command is:
  • terraform plan
    
  • Apply: At this final stage, Terraform executes the actions proposed in the plan and deploys the proposed changes to your cloud. The command is:
  • terraform apply
    
  • Destroy: The terraform destroy command is a convenient way to destroy all remote objects managed by a particular Terraform configuration. The command is:
  • terraform destroy
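Taken together, the stages above chain naturally, and saving the plan to a file lets apply execute exactly what plan showed. A minimal sketch of the full local cycle (`planfile` is an arbitrary file name chosen here for illustration, not a Terraform convention):

```shell
# Initialize the working directory and download provider plugins
terraform init

# Write the execution plan to a file so it can be reviewed and reused
terraform plan -out=planfile

# Apply exactly the saved plan (skips re-planning and the confirmation prompt)
terraform apply planfile

# Tear down everything this configuration manages when you are done
terraform destroy
```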
    

Using Terraform Locally

Now the question is how to make use of this tool, so let's see it in action. First we have to download and install Terraform on our machine: download it from HERE, extract the zip, and you will find an executable file inside. You can either add it to your system's PATH or start a shell from the same directory to start using Terraform. Let's deploy an EC2 instance in AWS:
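On Linux, for example, the install can be scripted roughly like this (the version number shown is a placeholder; pick the current release for your OS and architecture):

```shell
# Download and extract the Terraform release zip (example version shown)
curl -LO https://releases.hashicorp.com/terraform/1.5.7/terraform_1.5.7_linux_amd64.zip
unzip terraform_1.5.7_linux_amd64.zip

# Put the binary somewhere on your PATH
sudo mv terraform /usr/local/bin/

# Verify the install
terraform -version
```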

  • Create a ".tf" file in the same directory where you extracted the executable (or in any directory, if you updated your PATH) and open it in a code editor. First we have to tell Terraform which cloud provider we are going to use, so add the following to demo.tf:
  • terraform {
      required_providers {
        aws = {
          source  = "hashicorp/aws"
          version = "~> 3.0"
        }
      }
    }
    provider "aws" {
      region     = "ap-south-1"
      access_key = "YOU CAN DOWNLOAD KEYS FROM AWS CONSOLE"
      secret_key = "YOU CAN DOWNLOAD KEYS FROM AWS CONSOLE"
    }
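Hard-coding keys in a .tf file risks committing them to version control. As an alternative, the AWS provider can read credentials from the standard environment variables, leaving the provider block key-free. A sketch:

```hcl
# Credentials are picked up from the AWS_ACCESS_KEY_ID and
# AWS_SECRET_ACCESS_KEY environment variables, so nothing
# sensitive lives in the file.
provider "aws" {
  region = "ap-south-1"
}
```

Set the variables in your shell before running Terraform, e.g. `export AWS_ACCESS_KEY_ID=...` and `export AWS_SECRET_ACCESS_KEY=...`.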
    
  • Now we add the configuration to create the EC2 instance. The minimal information needed to launch an instance is an AMI ID and an instance type, both of which you can look up in the AWS console. The final file will look like this:
  • terraform {
      required_providers {
        aws = {
          source  = "hashicorp/aws"
          version = "~> 3.0"
        }
      }
    }
    provider "aws" {
      region     = "ap-south-1"
      access_key = "YOU CAN DOWNLOAD KEYS FROM AWS CONSOLE"
      secret_key = "YOU CAN DOWNLOAD KEYS FROM AWS CONSOLE"
    }
    resource "aws_instance" "demo" {
      ami           = "ami-0b02eacf129bfac4e"
      instance_type = "t2.micro"
      tags = {
        Name = "demo"
      }
    }
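Once the instance is created, an output block is a convenient way to surface its details (for example the public IP) at the end of terraform apply, instead of hunting through the AWS console. A small sketch building on the aws_instance.demo resource above:

```hcl
# Printed when `terraform apply` finishes, and retrievable
# later with `terraform output demo_public_ip`
output "demo_public_ip" {
  value = aws_instance.demo.public_ip
}
```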
    
  • FYI, you don't need to memorize all this syntax: the Terraform Registry hosts resources, modules, and configuration templates for every supported cloud provider. HEAD OVER HERE
  • Now our Terraform script is ready to be executed. As mentioned earlier, we have to run the three stages sequentially: launch a terminal, run terraform init, and after successful execution run terraform plan and finally terraform apply. Provide the requested input and the deployment will start!
Terraform With Gitlab CI

Using Terraform with GitLab seamlessly improves our cloud automation: we can create, reuse, and share scripts with our teams, and the Terraform stages are executed automatically every time a file changes. But how do we do that? GitLab provides the features and functionality needed for good DevOps practices. Follow the steps below to create a project/repo and set up a CI/CD pipeline for Terraform:

  • Go to the GitLab dashboard, create a new project/repository, choose blank project, and create it in whatever namespace you like.
  • We don't want our credentials exposed by writing them in the script file, so store them confidentially: go to Project Settings > CI/CD > Variables, expand it, and add them there.
  • Now we need to create/upload our Terraform script. You can upload the same script above that creates the EC2 instance, after removing the access_key/secret_key lines.
  • The next step is to create the pipeline: create a .gitlab-ci.yml file and add the code below. The pipeline will run on every push to the repository. In the image section we specify the Terraform image to use, and each job's commands go in its script section.
  • stages:
      - validate
      - plan
      - apply
    image:
      name: hashicorp/terraform:light
      entrypoint:
        - '/usr/bin/env'
        - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
    before_script:
      - export AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - export AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
      - rm -rf .terraform
      - terraform --version
      - terraform init
    validate:
      stage: validate
      script:
        - terraform validate
    plan:
      stage: plan
      script:
        - terraform plan -out "planfile"
      dependencies:
        - validate
      artifacts:
        paths:
          - planfile
    apply:
      stage: apply
      script:
        - terraform apply -input=false "planfile"
      dependencies:
        - plan
      when: manual
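The jobs above carry no branch restriction, so this pipeline runs on every push to any branch. If you want it to run only for the master branch, each job can be restricted with an only rule (a simple sketch, shown here for the plan job; GitLab's newer rules: keyword achieves the same thing):

```yaml
plan:
  stage: plan
  only:
    - master
  script:
    - terraform plan -out "planfile"
```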
    
  • Breakdown of the code: the block below declares the names of the pipeline stages
  • stages:
      - validate
      - plan
      - apply
    
  • We are using the Terraform image to run the Terraform scripts, and setting the entrypoint so the Terraform binary is found on the PATH
  • image:
      name: hashicorp/terraform:light
      entrypoint:
        - '/usr/bin/env'
        - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
    
  • In the before_script section we export the AWS access key and secret key, check the version of Terraform, do some cleanup, and initialize the working directory by running terraform init
  • before_script:
      - export AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - export AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
      - rm -rf .terraform
      - terraform --version
      - terraform init
    
  • In the plan section we generate the Terraform plan and store it as an artifact so that we can use it in the next stage
  • plan:
      stage: plan
      script:
        - terraform plan -out "planfile"
      dependencies:
        - validate
      artifacts:
        paths:
          - planfile
    
  • In the apply section we run terraform apply so that Terraform can create the resources on AWS; dependencies ensures that the apply stage only runs if the plan stage succeeds
  • when: manual ensures that we have to trigger this stage manually
  • apply:
      stage: apply
      script:
        - terraform apply -input=false "planfile"
      dependencies:
        - plan
      when: manual
    
  • Now, as soon as you commit your pipeline file, GitLab will trigger the pipeline and the resources will be created in your AWS account.
  • The validate and plan jobs run automatically, but you need to trigger the apply job manually
  • After the pipeline runs you will see that all the steps completed without errors, with passed written in green next to each successful job.
  • That's it! You have now learned to deploy resources on the cloud both locally and through GitLab CI. You can now go ahead, make custom changes, and modify it to your needs; you can find the complete code at my GitLab