Disaster Recovery & Backup Strategy Across Clouds

Introduction

Project Focus
This project focuses on establishing a Disaster Recovery (DR) & Backup strategy across multiple clouds. We will use:

  • AWS RDS Snapshots to back up databases,
  • Azure Blob Storage to hold a second copy of those backups, and
  • Google Cloud Storage (GCS) to hold a third copy.


Why is it Useful in the Real World?

  • High Availability & Reliability: Storing backups in multiple clouds ensures that data remains accessible even if one cloud provider faces downtime.
  • Scalability: Object storage on each provider grows with your data automatically, though free-tier allowances are capped.
  • Cost-Effective (Free Tier): We only use free-tier services and do not require extra credits, as each cloud offers a starter or free tier for low-volume usage.
  • Resilience & Portability: Having backups in different environments reduces the risk of data loss and offers flexibility if you need to migrate workloads.

Prerequisites

Required Tools & Accounts:

AWS Account with free-tier access
Make sure you have (or sign up for) the AWS Free Tier and that it is still active.


Azure Account with free-tier access
Sign up for the Azure Free Account if you do not have one yet.


Google Cloud Account with free-tier access
Use the Google Cloud Free Program. Ensure you select the “No automatic charges” option.


Installed Command-Line Tools on your local machine:
AWS CLI (for AWS actions)
Azure CLI (for Azure actions)
Google Cloud SDK (for GCP actions)
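
A quick way to confirm all three tools are installed and on your PATH (each command prints its version):

aws --version
az --version
gcloud --version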


Permissions:
AWS: an IAM user or role with administrative permissions (can create and manage RDS and S3).
Azure: Subscription Owner or Contributor permissions (can create Blob Storage containers).
GCP: Owner or Editor permissions on your Google Cloud Project (can create buckets).


Enabled APIs & Services

  • AWS: Make sure RDS is available in your region, and S3 is enabled by default.
  • Azure: Ensure the Storage resource provider is registered (usually default).
  • GCP: Enable the Cloud Storage API from the Console or via gcloud services enable storage.googleapis.com.
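
A quick CLI check for the Azure and GCP items above (a minimal sketch; names as used in this guide):

# Azure: check the Storage resource provider, and register it if needed
az provider show --namespace Microsoft.Storage --query registrationState --output tsv
az provider register --namespace Microsoft.Storage

# GCP: confirm the Cloud Storage API is enabled for the active project
gcloud services list --enabled | grep storage.googleapis.com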

Step-by-Step Implementation

We will demonstrate both the Console (GUI) approach and the Command-Line (CLI) approach for each major step.

Setting Up AWS RDS & Creating a Snapshot

Console (GUI) Steps

Log in to the AWS Console: navigate to https://aws.amazon.com/ and sign in.

Create a free-tier RDS database (if you do not already have one): go to the RDS service, click Create database, and select MySQL or PostgreSQL (free-tier eligible). Choose db.t2.micro, db.t3.micro, or db.t4g.micro (whichever is free-tier eligible in your region).

Create an RDS snapshot: in the RDS console, select your DB instance, choose Actions → Take Snapshot, and give it a name (e.g., my-free-tier-db-snapshot).

Verify the snapshot: go to Snapshots in the RDS console and confirm the snapshot status is Available.

CLI Steps (AWS)

Below are equivalent CLI commands. We also explain each command:

Configure the AWS CLI (if not already configured):

aws configure

What it does: Prompts you to enter your AWS Access Key, Secret Key, region, and default output format so all subsequent AWS CLI commands know which account and region to operate in.

Create the RDS snapshot:

aws rds create-db-snapshot \
  --db-snapshot-identifier my-free-tier-db-snapshot \
  --db-instance-identifier my-free-tier-db

What it does: Creates a snapshot named my-free-tier-db-snapshot for an RDS instance called my-free-tier-db (change to your actual DB instance identifier).

Verification: A JSON output is returned with the snapshot details. You can then run:

aws rds describe-db-snapshots --db-snapshot-identifier my-free-tier-db-snapshot

to confirm its status is available.
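
If you would rather block until the snapshot is ready than poll by hand, the AWS CLI also ships a waiter for this:

aws rds wait db-snapshot-available \
  --db-snapshot-identifier my-free-tier-db-snapshot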

Copying Snapshots to AWS S3 (Export / Store)

Console (GUI) Steps

Go to S3
From the AWS console, open S3.

Create a Bucket
Click Create bucket.
Name it something like my-multi-cloud-backups.
Choose a region (keep it the same as your RDS snapshot's region; the export step requires the bucket and snapshot to be in the same region).

Export RDS Snapshot to S3
Return to RDS → Snapshots.
Select your snapshot → Actions → Export Snapshot (AWS calls it “Export to Amazon S3” in the UI).
Configure export settings, choosing your newly created S3 bucket.

Verify Export
Go to S3 → open the bucket → you should see a folder or object named after your snapshot export.

CLI Steps (AWS)

Create S3 Bucket

aws s3 mb s3://my-multi-cloud-backups


What it does: Creates a new S3 bucket named my-multi-cloud-backups.


Note: Bucket names must be globally unique in S3; change the name if you get an error.


Export RDS Snapshot to S3

aws rds export-db-snapshot \
  --export-task-identifier my-rds-snapshot-export \
  --source-db-snapshot-identifier my-free-tier-db-snapshot \
  --s3-bucket-name my-multi-cloud-backups \
  --iam-role-arn arn:aws:iam::<YOUR_ACCOUNT_ID>:role/service-role/AWSS3ExportRole \
  --kms-key-id <YOUR_KMS_KEY_ID>


What it does: Initiates an export of your RDS snapshot to the specified S3 bucket. The export is written as Parquet files and must be encrypted, so the command needs both an IAM role with the right permissions and a KMS key (refer to the AWS docs on exporting snapshot data to S3 if you have not set these up).
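
If the export role does not exist yet, a minimal sketch of creating it might look like the following (the role and policy names are illustrative; the exact policy is spelled out in the AWS documentation on exporting snapshot data to S3):

aws iam create-role \
  --role-name AWSS3ExportRole \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {"Service": "export.rds.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }]
  }'

aws iam put-role-policy \
  --role-name AWSS3ExportRole \
  --policy-name ExportToS3 \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["s3:PutObject*", "s3:GetObject*", "s3:ListBucket", "s3:DeleteObject*", "s3:GetBucketLocation"],
      "Resource": ["arn:aws:s3:::my-multi-cloud-backups", "arn:aws:s3:::my-multi-cloud-backups/*"]
    }]
  }'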


Verification: You will get a JSON output. Check export status:

aws rds describe-export-tasks --export-task-identifier my-rds-snapshot-export


Once the status is COMPLETE, your data is in S3.
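
Exports can take a while. A small shell loop (a sketch, reusing the task identifier from above) can poll until the task finishes one way or the other:

while true; do
  STATUS=$(aws rds describe-export-tasks \
    --export-task-identifier my-rds-snapshot-export \
    --query 'ExportTasks[0].Status' --output text)
  echo "Export status: $STATUS"
  case "$STATUS" in
    COMPLETE) break ;;
    FAILED|CANCELED) echo "Export did not finish"; exit 1 ;;
  esac
  sleep 60
done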

Replicating Backup from AWS S3 to Azure Blob Storage

Console (GUI) Steps (Azure)

Log in to the Azure Portal: go to https://portal.azure.com/

Create a storage account: in the search bar, type Storage accounts, then click Create. Resource group: create or reuse one (e.g., MultiCloudRG). Storage account name: mymulticloudstorage (must be globally unique). Review the settings, then click Review + create.

Create a blob container: in your new storage account, go to Containers → + Container. Name: backups.

Manually upload from S3 to Azure, Option A (download & upload): download the export from the AWS S3 console, then in the Azure Portal open the backups container and click Upload.

(Manual cross-cloud copying in the browser is not ideal for large backups, but it is the simplest for demonstration. In production, we’d use CLI or automation scripts.)

CLI Steps (Azure + AWS)

To automate S3 → Azure Blob copy, you can use either:

  • The Azure CLI (az storage blob upload), or

  • Tools like rclone or AWS CLI to first download and re-upload.

Example with rclone (popular cross-cloud copy tool):

Configure rclone remotes for both AWS S3 and Azure Blob (interactive; it will prompt for credentials):

rclone config
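
The interactive wizard writes a config file. The resulting remotes might look roughly like this (a sketch: the remote names s3 and azure match the copy command below, and the credential values are placeholders):

# ~/.config/rclone/rclone.conf (illustrative)
[s3]
type = s3
provider = AWS
access_key_id = <AWS_ACCESS_KEY_ID>
secret_access_key = <AWS_SECRET_ACCESS_KEY>
region = us-east-1

[azure]
type = azureblob
account = mymulticloudstorage
key = <AZURE_STORAGE_ACCOUNT_KEY>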

Copy data from S3 to Azure

rclone copy s3:my-multi-cloud-backups azure:backups


  • What it does: Copies all objects from the my-multi-cloud-backups bucket in AWS S3 to the backups container in the mymulticloudstorage account on Azure (for the azureblob backend, the path after the colon is the container; the account name lives in the rclone config).
  • Verification: rclone will show each file being transferred. You can also check the Azure Portal to see your new objects.

(If you prefer using az storage CLI directly, you would first download from S3 to local, then run az storage blob upload to the Azure container. The principle is the same.)
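
A minimal sketch of that two-step alternative (the export prefix and local folder are illustrative; upload-batch handles a whole folder, where blob upload handles a single file):

# Step 1: download the exported objects from S3 to the local machine
aws s3 cp s3://my-multi-cloud-backups/my-rds-snapshot-export/ ./export/ --recursive

# Step 2: upload the folder to the backups container in Azure
az storage blob upload-batch \
  --account-name mymulticloudstorage \
  --destination backups \
  --source ./export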

Replicating from Azure to Google Cloud Storage (GCS)

Console (GUI) Steps (Google Cloud)

Log in to the Google Cloud Console: navigate to https://console.cloud.google.com/

Create a storage bucket: in the console, select Navigation Menu → Cloud Storage → Browser → Create Bucket. Name: my-multicloud-gcs-backups (must be globally unique).

Upload from Azure to GCS, Option A (manual): download the file from the Azure Portal, then upload it through the GCS console.

CLI Steps (Google Cloud + Azure)

Again, you can use a tool like rclone (see the sketch at the end of this section) or a manual two-step approach. For direct GCP CLI usage, you must have the backup locally first, then run:

If not already authenticated:

gcloud auth login


Copy backup from local to GCS

gsutil cp my-exported-snapshot-file gs://my-multicloud-gcs-backups/


What it does: Uses gsutil (Google Cloud Storage CLI tool) to copy a local file into the specified GCS bucket.


Verification: Run:

gsutil ls gs://my-multicloud-gcs-backups/


to confirm the file is there.
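
Alternatively, with a gcs remote added to the rclone config from earlier, the Azure-to-GCS hop can skip staging a file on local disk entirely (rclone streams the data through the machine running the command). A sketch, with the service-account path as a placeholder:

# Add to rclone.conf (illustrative)
[gcs]
type = google cloud storage
service_account_file = /path/to/service-account.json

# Copy straight from the Azure container to the GCS bucket
rclone copy azure:backups gcs:my-multicloud-gcs-backups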

Verifying and Testing the Project

  • AWS: Check the RDS snapshot is successfully created and exported (either in S3 console or via CLI).

  • Azure: Confirm that the backup file appears in your Blob container.

  • GCP: Ensure the backup file is present in your GCS bucket.

  • Optional Testing: Perform a small test restore from each cloud to a local or test environment. If your snapshots are for a database, spin up a test instance in the respective cloud to confirm the backup is valid (a restore sketch for the AWS leg follows this list).
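
A minimal restore test for the AWS leg might look like this (a sketch; the instance identifier is illustrative, and the instance class should match your free-tier eligibility):

aws rds restore-db-instance-from-db-snapshot \
  --db-instance-identifier my-restore-test \
  --db-snapshot-identifier my-free-tier-db-snapshot \
  --db-instance-class db.t3.micro

# Block until the restored instance is reachable, then connect and spot-check the data
aws rds wait db-instance-available --db-instance-identifier my-restore-test

# Delete the test instance afterwards to stay inside the free tier
aws rds delete-db-instance \
  --db-instance-identifier my-restore-test \
  --skip-final-snapshot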

Common Issues and Troubleshooting

  • Insufficient Permissions / IAM Roles:
    Make sure you have the correct roles to export RDS snapshots to S3 and to upload data to Azure and GCS.
  • Bucket / Container Name Conflicts:
    Names must be globally unique for AWS S3 and GCS. If you get an error, try another name.
  • Region Mismatch:
    Exporting snapshots or copying data across regions adds complexity and can incur cross-region data-transfer charges. Stick to the same or nearby regions to reduce latency and cost.
  • Exceeding Free Tier Limits:
    Watch your usage to ensure you stay within free limits. If you go beyond the free tiers (especially in data storage size), charges may apply.
  • Large File Transfers:
    Manual downloads and uploads can time out if the file is big. Use CLI tools like rclone or gsutil that support resumable uploads for reliability.

Conclusion

We have successfully implemented a multi-cloud Disaster Recovery & Backup strategy using AWS RDS Snapshots, Azure Blob Storage, and Google Cloud Storage—all under each provider’s free tier. We created a backup in AWS, replicated it to Azure, and finally stored a copy in Google Cloud. Along the way, we learned how to:

  • Set up and manage RDS snapshots,

  • Export data to AWS S3,

  • Transfer backups to Azure Blob Storage and then to GCS,

  • Utilize both console (GUI) and CLI approaches,

  • Troubleshoot common issues like permissions or region mismatches.

By following these steps, we ensure a cost-effective and resilient approach to safeguarding our data.

What is Cloud Computing?

Cloud computing delivers computing resources (servers, storage, databases, networking, and software) over the internet, allowing businesses to scale and pay only for what they use, eliminating the need for physical infrastructure.


  • AWS: The most popular cloud platform, offering scalable compute, storage, AI/ML, and networking services.
  • Azure: A strong enterprise cloud with hybrid capabilities and deep Microsoft product integration.
  • Google Cloud (GCP): Known for data analytics, machine learning, and open-source support.