Multi-Cloud CI/CD Pipeline for Microservices

Introduction

Project Overview

This project focuses on creating a multi-cloud CI/CD pipeline that automates the build, test, and deployment of containerized microservices to three major cloud providers (AWS, Azure, and Google Cloud). By using each provider’s native CI/CD service (AWS CodePipeline, Azure DevOps, and Google Cloud Build), we gain flexibility and insight into how to run workloads in different environments. Infrastructure as Code (IaC) tools such as Terraform or Pulumi will be used to provision and manage cloud resources in a unified manner.

Why Is It Useful in the Real World?

  • Scalability: Each platform offers automatic scaling options (e.g., ECS, AKS, GKE).

  • Portability: Containerized microservices can be moved seamlessly between clouds.

  • Cost Efficiency: By using each provider’s free-tier offerings, we pay only for consumption that exceeds the free tier.

  • Disaster Recovery & Flexibility: Multi-cloud setups minimize vendor lock-in and improve resilience.

Prerequisites

Below is everything you must have before starting. We will rely on free tiers so that charges stay minimal or zero:


Required Tools & Accounts

AWS Account with free-tier eligibility.
Must have AWS CLI installed locally.
Must have access to AWS CodePipeline, AWS CodeBuild, and Amazon ECS in your target region.
Permissions: AWS IAM user with AdministratorAccess or the relevant CodePipeline/CodeBuild/ECS permissions.

Azure Account with free-tier credits (no additional charges if staying within the free limits).
Must have the Azure CLI installed locally.
Must enable Azure DevOps and have permission to create Projects and Pipelines.
Permissions: Owner or Contributor rights in your Azure Subscription.

Google Cloud Account with free-tier usage (no credits needed if usage is minimal).
Must have the gcloud CLI (Google Cloud SDK) installed locally.
Must enable Google Cloud Build, Google Kubernetes Engine (GKE), and Container Registry APIs.
Permissions: Owner or Editor in your Google Cloud Project.

Docker installed locally for container image building and testing.

Terraform or Pulumi installed locally to handle Infrastructure as Code.

Git for source control.


Important: While each cloud’s free tier should suffice for small-scale tests, always verify your usage to avoid unexpected charges.

Step-by-Step Implementation

We will structure our steps so that each part can be done via Console (GUI) or via Terminal (CLI). Each step includes:


  • Action (what are we doing?)

  • Console Method (how to do it in the cloud provider’s web console)

  • Terminal Command (the equivalent CLI command, with an explanation of what it does)


Set Up Your Local Environment

Action: Configure your AWS, Azure, and Google Cloud CLI tools.
Console Method: Not applicable, since CLI setup is local; you can confirm the resulting credentials in each cloud console under “IAM & Admin,” “Access Management,” or similar.
Terminal Command:

aws configure

Explanation: This prompts for your AWS Access Key, Secret Key, default region, etc.

az login

Explanation: Opens a browser window to sign into your Azure account.

gcloud init

Explanation: Guides you through selecting your Google Cloud project and default region.

Action: Install Terraform or Pulumi.

  • Console Method: N/A (installation is purely local).
  • Terminal Command (Terraform example)


sudo apt-get update && sudo apt-get install -y gnupg software-properties-common

curl -fsSL https://apt.releases.hashicorp.com/gpg | sudo gpg --dearmor -o /usr/share/keyrings/hashicorp.gpg

echo "deb [signed-by=/usr/share/keyrings/hashicorp.gpg] https://apt.releases.hashicorp.com $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/hashicorp.list

sudo apt-get update && sudo apt-get install terraform


Explanation: Adds HashiCorp’s official repository and installs the Terraform CLI.


Prepare a Sample Microservice

Action: Create or clone a sample microservice (a simple “Hello World” Node.js or Python app) and place it in a Git repository.
Console Method:
Go to your Git hosting platform (e.g., GitHub, GitLab, or Azure Repos).
Create a new repository named multi-cloud-microservice.
Upload your app.js (or main.py) and Dockerfile.
Terminal Command:

git clone <your-repo-url> multi-cloud-microservice
cd multi-cloud-microservice
# add app.js (or main.py) and your Dockerfile here
git add .
git commit -m "Add sample microservice and Dockerfile"
git push origin main

Explanation: Clones the repo locally, adds your microservice files, and pushes them to the remote repository.
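For reference, here is a minimal Dockerfile sketch for the Node.js variant; the base image, port, and entry file are assumptions, so adjust them to match your app:

# Small base image for the sample service (assumed Node.js 18)
FROM node:18-alpine
WORKDIR /app
# Install dependencies first to take advantage of layer caching
COPY package*.json ./
RUN npm install --omit=dev
COPY . .
# Port is an assumption; match whatever app.js listens on
EXPOSE 3000
CMD ["node", "app.js"]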



Provision Infrastructure via Terraform (or Pulumi)

Below is an example using Terraform. We will create minimal resources in each cloud to keep this within free tiers.


Action: Write or download a Terraform template that:

  • Creates an ECS cluster on AWS,
  • Creates an AKS cluster on Azure,
  • Creates a GKE cluster on GCP,
  • Creates the corresponding container registries (ECR on AWS, ACR on Azure, and GCR on GCP),
  • Enables the required APIs.
Console Method: You can check your resources in each provider’s console (e.g., AWS > ECS > Clusters, Azure > Kubernetes services, GCP > Kubernetes Engine) after applying your Terraform plan.
Terminal Command:

terraform init
terraform plan
terraform apply

Explanation:

  • terraform init: Initializes the Terraform working directory, downloads provider plugins.
  • terraform plan: Shows what resources Terraform will create.
  • terraform apply: Provisions the resources in your AWS, Azure, and GCP accounts.
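
To make this concrete, below is a minimal Terraform sketch covering the three providers and two of the registries. All names, regions, and the GCP project ID are illustrative assumptions; GCR itself needs no resource, only its API and a first image push. Treat it as a starting point, not a complete template:

terraform {
  required_providers {
    aws     = { source = "hashicorp/aws" }
    azurerm = { source = "hashicorp/azurerm" }
    google  = { source = "hashicorp/google" }
  }
}

provider "aws" {
  region = "us-east-1"
}

provider "azurerm" {
  features {}
}

provider "google" {
  project = "my-gcp-project"   # illustrative project ID
  region  = "us-central1"
}

# ECR repository for the AWS side of the pipeline
resource "aws_ecr_repository" "app" {
  name = "multi-cloud-microservice"
}

# Resource group plus ACR for the Azure side
resource "azurerm_resource_group" "rg" {
  name     = "multi-cloud-rg"
  location = "eastus"
}

resource "azurerm_container_registry" "acr" {
  name                = "multicloudacr12345"   # must be globally unique and alphanumeric
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  sku                 = "Basic"
}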



Configure CI/CD Pipelines

We will create separate pipelines in AWS, Azure, and GCP. Each pipeline should pull the same repository but build and deploy to its respective environment.



AWS CodePipeline + CodeBuild

Action: Create a pipeline in AWS CodePipeline that pulls code from your Git repository, uses CodeBuild to build a Docker image, pushes it to ECR, and deploys to ECS.
Console Method:
Go to AWS Console > CodePipeline > Create Pipeline.
Set the pipeline name, e.g., Microservice-Pipeline-AWS.
Source Provider: choose your Git platform (GitHub, AWS CodeCommit, etc.).
Build Provider: AWS CodeBuild – create a new build project or select an existing one.
Deploy Provider: Amazon ECS – configure your ECS service.
Terminal Command (simplified example via AWS CLI):

aws codepipeline create-pipeline --cli-input-json file://pipeline-definition.json


Explanation:

  • This command reads a JSON file that defines your pipeline (stages, input sources, etc.) and creates the pipeline.
  • In pipeline-definition.json, you specify your Git source, CodeBuild config, and ECS deploy stage.
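
As a starting point, here is a skeleton of what pipeline-definition.json might look like. The role ARN, connection ARN, bucket, project, cluster, and service names are all placeholders; verify the exact schema against the output of aws codepipeline get-pipeline for a pipeline in your own account:

{
  "pipeline": {
    "name": "Microservice-Pipeline-AWS",
    "roleArn": "arn:aws:iam::123456789012:role/CodePipelineServiceRole",
    "artifactStore": { "type": "S3", "location": "my-pipeline-artifacts" },
    "stages": [
      {
        "name": "Source",
        "actions": [{
          "name": "Source",
          "actionTypeId": { "category": "Source", "owner": "AWS",
                            "provider": "CodeStarSourceConnection", "version": "1" },
          "configuration": {
            "ConnectionArn": "arn:aws:codestar-connections:us-east-1:123456789012:connection/example",
            "FullRepositoryId": "your-org/multi-cloud-microservice",
            "BranchName": "main"
          },
          "outputArtifacts": [{ "name": "SourceOutput" }]
        }]
      },
      {
        "name": "Build",
        "actions": [{
          "name": "Build",
          "actionTypeId": { "category": "Build", "owner": "AWS",
                            "provider": "CodeBuild", "version": "1" },
          "configuration": { "ProjectName": "microservice-build" },
          "inputArtifacts": [{ "name": "SourceOutput" }],
          "outputArtifacts": [{ "name": "BuildOutput" }]
        }]
      },
      {
        "name": "Deploy",
        "actions": [{
          "name": "Deploy",
          "actionTypeId": { "category": "Deploy", "owner": "AWS",
                            "provider": "ECS", "version": "1" },
          "configuration": { "ClusterName": "multi-cloud-cluster",
                             "ServiceName": "microservice",
                             "FileName": "imagedefinitions.json" },
          "inputArtifacts": [{ "name": "BuildOutput" }]
        }]
      }
    ]
  }
}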



Azure DevOps Pipeline

Action: Use Azure DevOps to build and deploy your Docker image to Azure Container Registry, then to an AKS cluster.
Console Method:
Go to Azure DevOps > Pipelines > New Pipeline.
Select your repo (GitHub or Azure Repos).
Use a YAML pipeline or classic editor.
If YAML: define your stages to build and push the container image to ACR, then deploy to AKS.
Terminal Command:

az pipelines create --name "Microservice-Pipeline-Azure" --repository <your-repo-url> --branch main --yml-path pipelines/azure-pipeline.yml

Explanation:

  • Creates an Azure DevOps pipeline from a local or remote YAML definition.
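
A minimal azure-pipeline.yml sketch follows; the service connection names, repository name, manifest path, and task versions are assumptions for illustration:

trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  # Build the image and push it to ACR via a Docker registry service connection
  - task: Docker@2
    inputs:
      containerRegistry: 'acr-service-connection'   # assumed service connection name
      repository: 'multi-cloud-microservice'
      command: 'buildAndPush'
      Dockerfile: '**/Dockerfile'
      tags: '$(Build.BuildId)'

  # Apply Kubernetes manifests against the AKS cluster
  - task: KubernetesManifest@1
    inputs:
      action: 'deploy'
      kubernetesServiceConnection: 'aks-service-connection'   # assumed connection name
      manifests: 'k8s/deployment.yaml'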



Google Cloud Build

Action: Set up Google Cloud Build to pull code from your Git repository, build and push your image to Google Container Registry (GCR), then deploy to GKE.
Console Method:
Go to Google Cloud Console > Cloud Build > Triggers > Create Trigger.
Select your Git repository (GitHub or Cloud Source Repositories).
Provide a cloudbuild.yaml specifying build steps, e.g., docker build, docker push, and a kubectl step that updates your GKE deployment.
Terminal Command:

gcloud builds triggers create github --repo-name=multi-cloud-microservice --repo-owner=<your-github-user> --branch-pattern="^main$" --build-config=cloudbuild.yaml

Explanation:

  • Creates a Cloud Build trigger that listens for commits on the main branch and executes the steps defined in cloudbuild.yaml.
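
A possible cloudbuild.yaml for those steps is sketched below; the deployment name, container name, cluster name, and zone are illustrative assumptions:

steps:
  # Build the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/multi-cloud-microservice:$SHORT_SHA', '.']

  # Push it to Container Registry
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/multi-cloud-microservice:$SHORT_SHA']

  # Point the GKE deployment at the new image
  - name: 'gcr.io/cloud-builders/kubectl'
    args: ['set', 'image', 'deployment/microservice',
           'microservice=gcr.io/$PROJECT_ID/multi-cloud-microservice:$SHORT_SHA']
    env:
      - 'CLOUDSDK_COMPUTE_ZONE=us-central1-a'
      - 'CLOUDSDK_CONTAINER_CLUSTER=my-gke-cluster'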


Consolidate or “Unify” the Pipelines (Optional)

Some teams prefer a single pipeline orchestrator (e.g., Jenkins or GitHub Actions) that triggers sub-pipelines on each cloud. This is more advanced and requires bridging tools. However, the principle remains the same: each pipeline can function independently in its respective cloud environment.
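
For example, an orchestrator job could kick off all three pipelines from one place with their CLIs. A sketch, assuming the pipeline and trigger names used earlier and already-authenticated CLIs:

# Start the AWS pipeline created earlier
aws codepipeline start-pipeline-execution --name Microservice-Pipeline-AWS

# Queue the Azure DevOps pipeline (requires the azure-devops CLI extension)
az pipelines run --name Microservice-Pipeline-Azure --org https://dev.azure.com/<your-org> --project <your-project>

# Run the Cloud Build trigger against main (trigger name is illustrative)
gcloud builds triggers run microservice-trigger --branch=main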

Verifying and Testing the Project

Check Pipeline Status
Ensure each pipeline (AWS, Azure, GCP) completes successfully without errors.

Access the Deployed Microservices
For ECS: get the load balancer DNS or the public IP of the ECS service. For AKS: get the external IP from the AKS service. For GKE: get the external IP from your GKE service. Visit the URL or IP in a browser; you should see your “Hello World” or test endpoint response. The commands below sketch one way to find these endpoints.
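
Assuming kubectl access and the illustrative resource names from earlier steps, something like:

# ECS (assuming an Application Load Balancer fronts the service)
aws elbv2 describe-load-balancers --query 'LoadBalancers[0].DNSName' --output text

# AKS: fetch credentials, then read the service's EXTERNAL-IP column
az aks get-credentials --resource-group multi-cloud-rg --name my-aks-cluster
kubectl get service microservice

# GKE: same idea with gcloud
gcloud container clusters get-credentials my-gke-cluster --zone us-central1-a
kubectl get service microservice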

Logs and Metrics
AWS: Check CloudWatch Logs for ECS tasks and CodeBuild. Azure: Check Azure Monitor (Log Analytics) for the container logs. GCP: Check Cloud Logging for the GKE or Cloud Build logs.

Common Issues and Troubleshooting

  • Insufficient IAM Permissions
    Make sure you have the correct roles (Administrator/Owner/Editor or specialized roles for pipelines and container deployments).
  • API Not Enabled
    On GCP, confirm you have Cloud Build, Container Registry, and GKE APIs enabled.
    On AWS, ensure ECS, ECR, CodePipeline, and CodeBuild are activated in your region.
  • Docker Build Failures
    Check Dockerfile syntax and ensure the correct base image.
    Validate environment variables (ENV) or ports (EXPOSE).
  • Terraform State Conflicts
    If multiple people run terraform apply simultaneously, you may get state lock issues. Use Terraform Cloud or a remote state store for collaboration (a backend sketch follows this list).
  • Deployment Failures
    Pipeline might succeed but the microservice might fail on the cluster if environment variables or resource limits are off. Check your cluster logs.
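
For the state-conflict issue above, a common remedy on AWS is an S3 backend with DynamoDB locking. A minimal sketch, assuming the bucket and lock table already exist (names are illustrative):

terraform {
  backend "s3" {
    bucket         = "my-terraform-state-bucket"    # pre-created S3 bucket
    key            = "multi-cloud/terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "terraform-locks"              # pre-created table with a LockID partition key
    encrypt        = true
  }
}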

Conclusion

We have successfully built a Multi-Cloud CI/CD Pipeline for containerized microservices using AWS, Azure, and Google Cloud. We have learned how to:


  • Configure local environments for multiple cloud providers
  • Set up Infrastructure as Code with Terraform or Pulumi
  • Create pipelines in AWS CodePipeline, Azure DevOps, and Google Cloud Build
  • Deploy to Amazon ECS, Azure AKS, and Google GKE in a repeatable and automated way


By leveraging free-tier accounts and minimal resource usage, this entire project can be maintained at little or no cost. We hope this guide helps teams and individuals explore multi-cloud strategies, improve resilience, and avoid vendor lock-in. We have gained valuable experience in orchestrating builds and deployments across different platforms, which is a highly sought-after skill in modern DevOps.

What Is Cloud Computing?

Cloud computing delivers computing resources (servers, storage, databases, networking, and software) over the internet, allowing businesses to scale and pay only for what they use, eliminating the need for physical infrastructure.


  • AWS: The most popular cloud platform, offering scalable compute, storage, AI/ML, and networking services.
  • Azure: A strong enterprise cloud with hybrid capabilities and deep Microsoft product integration.
  • Google Cloud (GCP): Known for data analytics, machine learning, and open-source support.