FullStack DevOps Project

Ujwal Pachghare 🌟 · Nov 16, 2024 · 5 min read

Git, GitHub Actions, Kubernetes, Docker, Docker-Compose, Terraform, AWS, Prometheus, Grafana, Nginx

# Tool Versions Used

  • python 3.10.12
  • django 5.0.2
  • node v18.20.5
  • npm 10.8.2
  • docker 24.0.7
  • docker-compose 2.24.6
  • k3s 1.30.5
  • helm 3.16.3
  • nginx 1.24.0

# Database Backup and Restore Scripts

db_backup.sh

#!/bin/bash

# Configuration
DB_PATH="../backend/db.sqlite3"
BACKUP_DIR="./backups"
DATE=$(date +%Y-%m-%d_%H-%M-%S)

# Make the backup directory if it doesn't exist
sudo mkdir -p "$BACKUP_DIR"

# Back up the SQLite db by copying it to the backup folder with a timestamp
sudo cp "$DB_PATH" "$BACKUP_DIR/db_backup_$DATE.sqlite3"

# Clean up backups older than 7 days
sudo find "$BACKUP_DIR" -type f -name "*.sqlite3" -mtime +7 -exec rm -f {} \;

echo "Backup completed: $BACKUP_DIR/db_backup_$DATE.sqlite3"

# Crontab Configuration (use the script's absolute path, since cron does not run from this directory)
# chmod +x db_backup.sh
# crontab -e
# 0 3 * * * /absolute/path/to/db_backup.sh   # runs at 3 AM every day

db_restore.sh

#!/bin/bash

# Configuration
DB_PATH="../backend/db.sqlite3"
BACKUP_DIR="./backups"

# Check if a backup file was provided
if [ -z "$1" ]; then
  echo "Usage: $0 <backup_file>"
  exit 1
fi

BACKUP_FILE="$1"

# Check if the backup file exists
if [ ! -f "$BACKUP_FILE" ]; then
  echo "Error: Backup file '$BACKUP_FILE' not found."
  exit 1
fi

# Restore the database
cp "$BACKUP_FILE" "$DB_PATH"
echo "Database restored from '$BACKUP_FILE' to '$DB_PATH'."

# Run migrations (optional)
cd ../backend/ || exit 1
python manage.py migrate

## Main GitHub Workflow

This GitHub Actions workflow automates the building, testing, and deployment of a ChatGPT application with Docker, Helm, and Kubernetes, including code quality checks and vulnerability scans.

Key Features:

  • SonarQube Analysis: Static code analysis to detect code quality issues.
  • PM2: Used to manage both backend and frontend processes.
  • Trivy: Vulnerability scanner to check for critical/high vulnerabilities in the Docker images.
  • Docker & Helm: Docker images are built, pushed to Docker Hub, and Kubernetes Helm charts are updated accordingly.

Triggers:

  • Push events to main, ujwal-devops, and ujwal-cicd branches, affecting paths:
      • .github/workflows/main.yml
      • backend/**, frontend/**, images/**, prompts/**
  • Pull Request events of type synchronize
  • Manual trigger via workflow_dispatch
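
A trigger block along these lines implements the behaviour described above (a sketch based on the list, not the exact file from the repo):

```yaml
on:
  push:
    branches: [main, ujwal-devops, ujwal-cicd]
    paths:
      - ".github/workflows/main.yml"
      - "backend/**"
      - "frontend/**"
      - "images/**"
      - "prompts/**"
  pull_request:
    types: [synchronize]
  workflow_dispatch:
```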

Steps:

1. SonarQube-Analysis:

  • Runs SonarQube code analysis using the SonarCloud action, setting up necessary environment variables from secrets.
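
For reference, the SonarCloud job can be sketched roughly like this; the SONAR_PROPERTIES secret is assumed to hold the contents of sonar-project.properties:

```yaml
SonarQube-Analysis:
  runs-on: ubuntu-24.04
  steps:
    - uses: actions/checkout@v4
      with:
        fetch-depth: 0   # full history improves SonarCloud analysis
    - name: Create sonar-project.properties from secret   # assumed usage of SONAR_PROPERTIES
      run: echo "${{ secrets.SONAR_PROPERTIES }}" > sonar-project.properties
    - name: SonarCloud Scan
      uses: SonarSource/sonarcloud-github-action@master
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
```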

2. Build-App-Backend:

  • Builds the backend app by setting up the Python environment, installing dependencies, running migrations, collecting static files, and starting the app using pm2.
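
A rough sketch of this job (the requirements file path, port, and pm2 process name are assumptions):

```yaml
Build-App-Backend:
  needs: SonarQube-Analysis
  runs-on: ubuntu-24.04
  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-python@v5
      with:
        python-version: "3.10"
    - name: Install dependencies
      working-directory: backend
      run: |
        echo "${{ secrets.BACKEND_ENV }}" > .env   # assumed .env handling
        pip install -r requirements.txt            # assumed requirements file
    - name: Migrate, collect static files, and start with pm2
      working-directory: backend
      run: |
        python manage.py migrate
        python manage.py collectstatic --noinput
        npm install -g pm2
        pm2 start manage.py --name backend --interpreter python3 -- runserver 0.0.0.0:8000   # assumed port and process name
```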

3. Build-App-Frontend:

  • Builds the frontend app by setting up the Node environment, installing dependencies, building static files, and starting the app using pm2.
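
And the frontend counterpart (build command and pm2 invocation are assumptions based on a typical Node app):

```yaml
Build-App-Frontend:
  needs: SonarQube-Analysis
  runs-on: ubuntu-24.04
  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-node@v4
      with:
        node-version: 18
    - name: Install dependencies and build static files
      working-directory: frontend
      run: |
        echo "${{ secrets.FRONTEND_ENV }}" > .env   # assumed .env handling
        npm ci
        npm run build
    - name: Start with pm2
      working-directory: frontend
      run: |
        npm install -g pm2
        pm2 start npm --name frontend -- start      # assumed start script
```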

4. Image-Vuln-Check-Backend:

  • Builds a Docker image for the backend and runs Trivy to scan for security vulnerabilities (CRITICAL and HIGH severity). The scan result is uploaded as an artifact.
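
The Trivy step can be sketched with the official trivy-action (image name and report filename are placeholders):

```yaml
Image-Vuln-Check-Backend:
  runs-on: ubuntu-24.04
  steps:
    - uses: actions/checkout@v4
    - name: Build backend image
      run: docker build -t chatgpt-backend:scan ./backend
    - name: Scan for CRITICAL/HIGH vulnerabilities
      uses: aquasecurity/trivy-action@master
      with:
        image-ref: chatgpt-backend:scan
        severity: CRITICAL,HIGH
        format: table
        output: trivy-backend-report.txt
    - name: Upload scan report as artifact
      uses: actions/upload-artifact@v4
      with:
        name: trivy-backend-report
        path: trivy-backend-report.txt
```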

5. Image-Vuln-Check-Frontend:

  • Builds a Docker image for the frontend and performs the same vulnerability scan using Trivy. The report is also uploaded as an artifact.

6. Push-To-DockerHub-Backend:

  • Pushes the backend Docker image to Docker Hub with a version tag based on the GitHub run number.
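
A sketch of the push job (the image name and exact tag scheme are assumptions; the run number comes from github.run_number):

```yaml
Push-To-DockerHub-Backend:
  runs-on: ubuntu-24.04
  steps:
    - uses: actions/checkout@v4
    - uses: docker/setup-buildx-action@v3
    - uses: docker/login-action@v3
      with:
        username: ${{ vars.DOCKERHUB_USERNAME }}
        password: ${{ secrets.DOCKERHUB_TOKEN }}
    - uses: docker/build-push-action@v6
      with:
        context: ./backend
        push: true
        tags: ${{ vars.DOCKERHUB_USERNAME }}/chatgpt-backend:v1.0.${{ github.run_number }}   # assumed image name and tag format
```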

7. Push-To-DockerHub-Frontend:

  • Pushes the frontend Docker image to Docker Hub with a similar version tag.

8. Update-Helm-Chart-Backend:

  • Updates the Helm chart for the backend with the new Docker image tag and commits the changes to the repository.
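
Updating the chart usually comes down to a sed on values.yaml followed by a commit; a sketch (chart path and tag format are assumptions):

```yaml
Update-Helm-Chart-Backend:
  runs-on: ubuntu-24.04
  permissions:
    contents: write
  steps:
    - uses: actions/checkout@v4
    - name: Bump backend image tag in the Helm chart
      run: |
        # helm/backend/values.yaml is an assumed chart location
        sed -i "s|tag:.*|tag: v1.0.${{ github.run_number }}|" helm/backend/values.yaml
    - name: Commit and push the new tag
      run: |
        git config user.email "${{ secrets.GH_USER_MAIL }}"
        git config user.name  "${{ secrets.GH_USER_NAME }}"
        git commit -am "Update backend image tag to v1.0.${{ github.run_number }}"
        git push
```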

9. Update-Helm-Chart-Frontend:

  • Updates the Helm chart for the frontend and commits the new image tag to the repository.

Configuration:

  • Ubuntu 24.04 runners are used for most jobs.
  • Backend and Frontend jobs are dependent on successful SonarQube analysis.
  • Docker Buildx and Trivy are used for image building and security scanning.
  • Permissions: Requires write access to actions and contents.

Environment Variables and Secrets:

Secrets:

  • SONAR_PROPERTIES: Properties for SonarQube analysis.
  • GITHUB_TOKEN, SONAR_TOKEN, DOCKERHUB_TOKEN: For authentication with GitHub, SonarCloud, and Docker Hub respectively.
  • BACKEND_ENV, FRONTEND_ENV: Environment variable files for backend and frontend.
  • GH_USER_MAIL, GH_USER_NAME: GitHub user information for committing changes.

Variables:

  • DOCKERHUB_USERNAME: Username for Docker Hub.
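
Inside the workflow, secrets and repository variables are referenced through different contexts, for example:

```yaml
env:
  SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}            # repository secret
  DOCKERHUB_USERNAME: ${{ vars.DOCKERHUB_USERNAME }} # repository variable
```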

## Docker-Compose Deployment Workflow

This GitHub Actions workflow automates the deployment of a Docker Compose-based application. It is triggered on specific push events, pull request synchronizations, and manual triggers.

Triggers:

  • Push events to ujwal-cicd and ujwal-docker branches, affecting paths:
      • .github/workflows/docker.yml
      • backend/**
      • frontend/**
  • Pull Request events of type synchronize
  • Manual trigger via workflow_dispatch

Steps:

  1. Checkout Code: Fetches the latest code from the repository.
  2. Setup Environment Variables: Writes secret environment variables to local .env files for both frontend and backend from GitHub secrets.
  3. Destroy Previous Deployment: Stops any running Docker Compose containers with docker compose down.
  4. Apply New Deployment: Starts the new deployment using docker compose up -d.
  5. Restart Nginx: Restarts the Nginx service to apply new changes.
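
Putting those steps together, the workflow looks roughly like this (the .env paths and the Nginx restart command are assumptions about the self-hosted host):

```yaml
name: Docker Compose Deployment
on:
  push:
    branches: [ujwal-cicd, ujwal-docker]
    paths:
      - ".github/workflows/docker.yml"
      - "backend/**"
      - "frontend/**"
  pull_request:
    types: [synchronize]
  workflow_dispatch:

permissions:
  actions: read
  contents: read

jobs:
  deploy:
    runs-on: self-hosted
    defaults:
      run:
        shell: bash
    steps:
      - name: Checkout Code
        uses: actions/checkout@v4
      - name: Setup Environment Variables
        run: |
          echo "${{ secrets.BACKEND_ENV }}"  > backend/.env    # assumed .env locations
          echo "${{ secrets.FRONTEND_ENV }}" > frontend/.env
      - name: Destroy Previous Deployment
        run: docker compose down
      - name: Apply New Deployment
        run: docker compose up -d
      - name: Restart Nginx
        run: sudo systemctl restart nginx    # assumed restart command on the host
```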

Configuration:

  • Runs on a self-hosted runner.
  • Default shell for steps is set to bash.
  • Requires actions and contents permissions (read).

## Terraform Deployment Workflow

This GitHub Actions workflow automates the deployment of infrastructure using Terraform. It is triggered on specific push events, pull request synchronizations, and manual triggers.

Triggers:

  • Push events to ujwal-cicd and ujwal-tf branches, affecting paths:
      • .github/workflows/tf.yml
      • tf/files/**
  • Pull Request events of type synchronize
  • Manual trigger via workflow_dispatch

Environment Variables:

  • AWS_REGION: From workflow variables
  • AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY: From GitHub secrets

Steps:

  1. Checkout Code: Fetches the latest code from the repository.
  2. Setup Terraform: Installs the specified Terraform version (1.9.8).
  3. Terraform fmt: Checks the formatting of Terraform files.
  4. Terraform Init: Initializes the Terraform configuration.
  5. Terraform Validate: Validates the Terraform configuration files.
  6. Terraform Plan: Generates an execution plan for applying changes, using the variables.tfvars file.
  7. Terraform Apply: Applies the Terraform plan if the commit message contains ‘Apply’.
  8. Terraform Destroy: Destroys the infrastructure if the commit message contains ‘Destroy’.
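
A condensed sketch of the Terraform workflow (note that head_commit.message is only available on push events, so the Apply/Destroy conditions are written for that case):

```yaml
name: Terraform Deployment
on:
  push:
    branches: [ujwal-cicd, ujwal-tf]
    paths:
      - ".github/workflows/tf.yml"
      - "tf/files/**"
  pull_request:
    types: [synchronize]
  workflow_dispatch:

permissions:
  actions: read
  contents: read
  pull-requests: write

env:
  AWS_REGION: ${{ vars.AWS_REGION }}
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

jobs:
  terraform:
    runs-on: ubuntu-24.04
    defaults:
      run:
        shell: bash
        working-directory: tf/files
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
        with:
          terraform_version: 1.9.8
      - name: Terraform fmt
        run: terraform fmt -check
      - name: Terraform Init
        run: terraform init
      - name: Terraform Validate
        run: terraform validate
      - name: Terraform Plan
        run: terraform plan -var-file=variables.tfvars -out=tfplan
      - name: Terraform Apply
        if: contains(github.event.head_commit.message, 'Apply')
        run: terraform apply -auto-approve tfplan
      - name: Terraform Destroy
        if: contains(github.event.head_commit.message, 'Destroy')
        run: terraform destroy -auto-approve -var-file=variables.tfvars
```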

Configuration:

  • Runs on Ubuntu 24.04.
  • Default shell for steps is set to bash, with the working directory set to tf/files/.
  • Requires actions and contents permissions (read), and pull-requests permissions (write).

## Chatgpt App Frontend and Backend Deployment with ArgoCD

Chatgpt-Frontend Deployment with ArgoCD

Chatgpt-Backend Deployment with ArgoCD
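
Both deployments are managed by ArgoCD Application resources that track the Helm charts updated by the main workflow; a minimal manifest for the frontend might look like this (repo URL, chart path, and namespaces are placeholders):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: chatgpt-frontend
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://github.com/<user>/<repo>.git   # placeholder repo URL
    targetRevision: main
    path: helm/frontend                             # assumed chart location
  destination:
    server: https://kubernetes.default.svc
    namespace: chatgpt                              # assumed target namespace
  syncPolicy:
    automated:
      prune: true
      selfHeal: true
```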

## Prometheus Metrics

CPU Usage

Memory Usage

## Grafana Dashboard

Application CPU and Memory Usage

All Processes and Containers CPU Usage

All Processes and Containers Memory Usage

Written by Ujwal Pachghare 🌟

DevOps Engineer |🛠️Troubleshooter | 🤖Automation Lover |💡Problem Solver |📚Lifelong Learner
