In today's cloud-driven world, managing infrastructure efficiently is essential. This project focuses on using Terraform to set up and manage AWS resources. We use Amazon S3 to store the Terraform state file, keeping track of our infrastructure's current state securely.
The highlight of this project is the automation: by integrating GitLab CI/CD, the entire process of provisioning infrastructure is automated, leaving only manual approval for the critical apply and destroy steps. This makes deploying resources faster, more reliable, and easier to manage.
PREREQUISITES:
AWS Account
GitLab Account
Machine with Terraform and the AWS CLI installed
AWS Access Keys
Step 1: Writing Terraform code for VPC and EC2:
We’ll organize our Terraform code using modules. We’ll create a VPC module to set up networking components and a Web module to provision EC2 instances for our web servers. These modules help keep our code clean, reusable, and scalable.
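As a concrete illustration, a minimal root main.tf wiring the two modules together might look like the sketch below (module paths, variable names, and the module output referenced here are assumptions for illustration, not the exact code from the repository):

provider "aws" {
  region = "ap-south-1"
}

# VPC module: creates the networking layer (VPC, subnets, internet gateway, routing)
module "vpc" {
  source   = "./modules/vpc"          # hypothetical module path
  vpc_cidr = "10.0.0.0/16"            # illustrative CIDR block
}

# Web module: provisions the EC2 web server inside the VPC's public subnet
module "web" {
  source        = "./modules/web"     # hypothetical module path
  subnet_id     = module.vpc.public_subnet_id   # assumes the VPC module exposes this output
  instance_type = "t2.micro"
}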
Our file structure will be as follows:
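A typical layout for this setup looks like the tree below (directory and file names are an assumed sketch; refer to the repository linked next for the exact structure):

terraform-cicd/
├── backend.tf        # S3 remote state and DynamoDB locking
├── main.tf           # calls the vpc and web modules
├── variables.tf
├── outputs.tf
├── .gitlab-ci.yml    # GitLab CI/CD pipeline definition
└── modules/
    ├── vpc/
    │   ├── main.tf
    │   ├── variables.tf
    │   └── outputs.tf
    └── web/
        ├── main.tf
        ├── variables.tf
        └── outputs.tf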
You can get the above code here: https://gitlab.com/basics4530332/terraform-cicd.git
In the above code, update backend.tf with the name of the S3 bucket and the DynamoDB table you created for remote state storage and state locking.
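For reference, a minimal backend.tf of this shape might look like the sketch below (the bucket and table names are placeholders to replace with your own; the region is assumed to match the pipeline's ap-south-1):

terraform {
  backend "s3" {
    bucket         = "your-terraform-state-bucket"   # replace with your S3 bucket name
    key            = "terraform.tfstate"             # path of the state file inside the bucket
    region         = "ap-south-1"                    # assumed region, matching the pipeline
    dynamodb_table = "your-terraform-lock-table"     # replace with your DynamoDB table name
    encrypt        = true                            # encrypt the state file at rest
  }
}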
Step 2: Setting up the pipeline to provision infrastructure:
Here we will write the pipeline code to validate, plan, apply and destroy the AWS infrastructure.
Creating the variables for the access keys:
In your GitLab repository, go to Settings > CI/CD > Variables and create variables for the AWS access key ID and secret access key (in this project they are named MY_AWS_KEY and MY_AWS_ACCESS_KEY, which the pipeline script references).
Now we will write the pipeline script. Create a file named .gitlab-ci.yml in the root of the repository; the following is the pipeline script.
image:
  name: registry.gitlab.com/gitlab-org/gitlab-build-images:terraform
  entrypoint:
    - '/usr/bin/env'
    - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'

variables:
  AWS_ACCESS_KEY_ID: ${MY_AWS_KEY}
  AWS_SECRET_ACCESS_KEY: ${MY_AWS_ACCESS_KEY}
  AWS_DEFAULT_REGION: "ap-south-1"

cache:
  paths:
    - .terraform

before_script:
  - terraform --version
  - terraform init

stages:
  - validate
  - plan
  - apply
  - destroy

validate:
  stage: validate
  script:
    - terraform validate

plan:
  stage: plan
  script:
    - terraform plan -out="planfile"
  dependencies:
    - validate
  artifacts:
    paths:
      - planfile

apply:
  stage: apply
  script:
    - terraform apply -input=false "planfile"
  dependencies:
    - plan
  when: manual

destroy:
  stage: destroy
  script:
    - terraform destroy --auto-approve
  when: manual
Image: Uses a Docker image with Terraform pre-installed to run the pipeline.
Variables: Sets up AWS credentials and the default region using environment variables (MY_AWS_KEY, MY_AWS_ACCESS_KEY).
Cache: Caches the .terraform directory to speed up future pipeline runs.
Before Script: Ensures Terraform is initialized and ready by checking its version and running terraform init.
Stages: Defines four stages: validate, plan, apply, and destroy.
Validate: Checks the Terraform configuration for syntax errors using terraform validate.
Plan: Generates an execution plan (terraform plan) and saves it as planfile. The dependencies keyword ensures this stage only runs after the validate stage. The artifacts section stores the planfile for use in later stages.
Apply: Applies the saved plan file to provision resources on AWS (terraform apply). The dependencies keyword makes sure this stage only runs after the plan stage. The when: manual keyword ensures that the apply stage requires manual approval before it executes, providing control over when changes are applied.
Destroy: This stage destroys the AWS infrastructure (terraform destroy) without needing confirmation (--auto-approve). It is also set to run manually with when: manual, allowing you to destroy the infrastructure only when you decide to.
Conclusion:
This project shows how to use Terraform with GitLab CI/CD to automate AWS infrastructure management. By structuring our Terraform code with modules and automating the pipeline, we streamline deployments and reduce manual tasks. Manual approvals for critical stages ensure control and safety in our automation process, making it efficient and reliable.
For more insightful content on technology, AWS, and DevOps, make sure to follow me for the latest updates and tips. If you have any questions or need further assistance, feel free to reach out—I’m here to help!
Streamline, Deploy, Succeed -- DevOps Made Simple! ☺️