Deployment

Last updated: 2026-01-30

Overview

Deployment of the Monstermessenger application is fully automated using GitHub Actions. The CI/CD pipeline is configured to build, test, and deploy the application to Google Cloud Platform (GCP). The process distinguishes between development and production environments based on the Git branch, and it deploys only the components that have changed.

Target Architecture

The application is deployed as two separate but connected services on GCP, with distinct youth and adult variants for each environment.

  • Backend: The Python FastAPI application is containerized using Docker and deployed to Google Cloud Run. Each variant (youth, adult) runs as a separate Cloud Run service with its own environment variables and configuration.
  • Frontend: The React single-page application is deployed as a static website to Google Cloud Storage. Each variant has its own GCS bucket, configured for static website hosting with index.html as the entry point.
  • Environments:
    • Development: Pushing to the dev branch deploys to a development environment. Cloud Run services and GCS buckets for this environment are typically suffixed with -dev.
    • Production: Pushing to the main branch deploys to the production environment. (An illustrative naming sketch follows this list.)
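
To make the environment and variant matrix concrete, the sketch below lays out one plausible naming scheme. Every name shown is an illustrative assumption; the real service and bucket names are defined in the workflow files described in the next section.

```yaml
# Illustrative only: hypothetical Cloud Run service and GCS bucket names per
# environment and variant. The -dev suffix follows the convention noted above.
development:            # deployed from the dev branch
  youth:
    cloud_run_service: monstermessenger-backend-youth-dev
    gcs_bucket: monstermessenger-frontend-youth-dev
  adult:
    cloud_run_service: monstermessenger-backend-adult-dev
    gcs_bucket: monstermessenger-frontend-adult-dev
production:             # deployed from the main branch
  youth:
    cloud_run_service: monstermessenger-backend-youth
    gcs_bucket: monstermessenger-frontend-youth
  adult:
    cloud_run_service: monstermessenger-backend-adult
    gcs_bucket: monstermessenger-frontend-adult
```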

CI/CD Workflows

The deployment logic is managed by a set of GitHub Actions workflows located in the .github/workflows/ directory.

deploy-variants.yaml

This is the main workflow that orchestrates the entire deployment process.

  • Trigger: It runs automatically on every push to the dev and main branches.
  • Path Filtering: The first job, check-changes, uses the dorny/paths-filter action to determine which parts of the application (backend, frontend, requirements) were modified by the push. This allows the workflow to skip unnecessary build and deploy steps, saving time and resources.
  • Multi-Stage Docker Build: The workflow uses an efficient multi-stage build process for the backend:
    1. Environment Image (build-env-image): A base Docker image containing all Python dependencies is built using deployment/env.Dockerfile. This job runs only if requirements.txt or env.Dockerfile itself has changed. The resulting image is pushed to Google Artifact Registry and tagged as latest.
    2. Application Image: The final application image is built later, using the pre-built environment image as a base. This step is much faster as dependencies are already installed.
  • Reusable Deploy Trigger: For each variant (youth and adult), this workflow calls the reusable-deploy.yaml workflow, passing the correct names, the variant, and the results of the path filtering as parameters. (A condensed sketch of this orchestration follows the list.)
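
For orientation, here is a condensed sketch of how such an orchestrating workflow can be wired together. The trigger branches, the dorny/paths-filter action, the deployment/env.Dockerfile path, and the variant names come from the description above; the job names, filter paths, registry path (REGION/PROJECT/REPO), input names, and secret name are assumptions, not the real values.

```yaml
# Condensed, illustrative sketch of deploy-variants.yaml (not the real file).
name: deploy-variants
on:
  push:
    branches: [dev, main]

jobs:
  check-changes:
    runs-on: ubuntu-latest
    outputs:
      backend: ${{ steps.filter.outputs.backend }}
      frontend: ${{ steps.filter.outputs.frontend }}
      requirements: ${{ steps.filter.outputs.requirements }}
    steps:
      - uses: actions/checkout@v4
      - uses: dorny/paths-filter@v3
        id: filter
        with:
          filters: |
            backend:
              - 'backend/**'
            frontend:
              - 'frontend/**'
            requirements:
              - 'requirements.txt'
              - 'deployment/env.Dockerfile'

  build-env-image:
    needs: check-changes
    if: needs.check-changes.outputs.requirements == 'true'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}  # placeholder secret name
      - uses: google-github-actions/setup-gcloud@v2
      - name: Build and push dependency base image
        run: |
          # REGION/PROJECT/REPO are placeholders for the Artifact Registry path.
          gcloud auth configure-docker REGION-docker.pkg.dev --quiet
          docker build -f deployment/env.Dockerfile \
            -t REGION-docker.pkg.dev/PROJECT/REPO/backend-env:latest \
            .
          docker push REGION-docker.pkg.dev/PROJECT/REPO/backend-env:latest

  deploy-youth:
    needs: [check-changes, build-env-image]
    if: ${{ !failure() && !cancelled() }}   # still run when the env-image job was skipped
    uses: ./.github/workflows/reusable-deploy.yaml
    secrets: inherit
    with:
      variant: youth
      backend_changed: ${{ needs.check-changes.outputs.backend }}
      frontend_changed: ${{ needs.check-changes.outputs.frontend }}

  deploy-adult:
    needs: [check-changes, build-env-image]
    if: ${{ !failure() && !cancelled() }}
    uses: ./.github/workflows/reusable-deploy.yaml
    secrets: inherit
    with:
      variant: adult
      backend_changed: ${{ needs.check-changes.outputs.backend }}
      frontend_changed: ${{ needs.check-changes.outputs.frontend }}
```

The two deploy jobs differ only in the variant they pass down, which keeps all of the build and deploy logic in the reusable workflow.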

reusable-deploy.yaml

This reusable workflow contains the core logic for building and deploying a single variant of the application. It is called twice by deploy-variants.yaml, once for youth and once for adult.

  1. Authentication: Authenticates to GCP using a service account stored in GitHub Secrets.
  2. Build & Push Backend:
    • If backend files have changed (based on the input from the parent workflow), it builds the final application Docker image using deployment/backend.Dockerfile.
    • The CHATBOT_VARIANT is passed as a build argument to configure the container correctly.
    • The image is pushed to Google Artifact Registry.
  3. Deploy to Cloud Run:
    • If the backend was built successfully, it deploys the new image to the corresponding Cloud Run service using gcloud run deploy.
    • It sets the necessary environment variables, such as CHATBOT_VARIANT and CORS_ORIGINS, which points to the frontend’s GCS bucket URL.
  4. Build & Deploy Frontend:
    • If frontend files have changed, it checks out the code, installs Node.js dependencies with npm ci, and builds the React application with npm run build:dev or npm run build:prod, depending on the environment.
    • The backend’s Cloud Run URL is passed as the VITE_API_URL environment variable during the build, connecting the frontend to its API.
    • The compiled static assets from the frontend/dist/ directory are synchronized to the correct Google Cloud Storage bucket using gsutil rsync.
    • The GCS bucket is configured for public web access. (A condensed sketch of these steps follows this list.)
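
The sketch below shows how the per-variant steps can look in practice. The Dockerfile path, the CHATBOT_VARIANT build argument, the CORS_ORIGINS and VITE_API_URL variables, npm ci, the build:dev/build:prod scripts, and gsutil rsync come from the steps above; the input names, secret name, region, registry path, and service/bucket placeholders (REGION, PROJECT, REPO, FRONTEND_BUCKET) are assumptions.

```yaml
# Illustrative sketch of reusable-deploy.yaml for one variant (not the real file).
name: reusable-deploy
on:
  workflow_call:
    inputs:
      variant:
        required: true
        type: string
      backend_changed:
        required: true
        type: string
      frontend_changed:
        required: true
        type: string

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}  # placeholder secret name
      - uses: google-github-actions/setup-gcloud@v2

      - name: Build and push backend image
        if: inputs.backend_changed == 'true'
        run: |
          gcloud auth configure-docker REGION-docker.pkg.dev --quiet
          docker build -f deployment/backend.Dockerfile \
            --build-arg CHATBOT_VARIANT=${{ inputs.variant }} \
            -t REGION-docker.pkg.dev/PROJECT/REPO/backend-${{ inputs.variant }}:${{ github.sha }} \
            .
          docker push REGION-docker.pkg.dev/PROJECT/REPO/backend-${{ inputs.variant }}:${{ github.sha }}

      - name: Deploy to Cloud Run
        if: inputs.backend_changed == 'true'
        run: |
          gcloud run deploy backend-${{ inputs.variant }} \
            --image REGION-docker.pkg.dev/PROJECT/REPO/backend-${{ inputs.variant }}:${{ github.sha }} \
            --region REGION \
            --set-env-vars "CHATBOT_VARIANT=${{ inputs.variant }},CORS_ORIGINS=https://storage.googleapis.com/FRONTEND_BUCKET"

      - name: Look up the backend URL for the frontend build
        if: inputs.frontend_changed == 'true'
        id: backend
        run: |
          echo "url=$(gcloud run services describe backend-${{ inputs.variant }} \
            --region REGION --format='value(status.url)')" >> "$GITHUB_OUTPUT"

      - name: Build frontend
        if: inputs.frontend_changed == 'true'
        working-directory: frontend
        env:
          VITE_API_URL: ${{ steps.backend.outputs.url }}
        run: |
          npm ci
          npm run build:prod   # build:dev on the development branch

      - name: Sync frontend to GCS
        if: inputs.frontend_changed == 'true'
        run: |
          gsutil -m rsync -r -d frontend/dist gs://FRONTEND_BUCKET
          # One common way to allow public web access (bucket name is a placeholder):
          gsutil iam ch allUsers:objectViewer gs://FRONTEND_BUCKET
```

Because the workflow receives the change flags as inputs rather than re-running the path filter, both variants act on the single check-changes result computed by the parent workflow.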

build_quarto_docs.yaml

This is a separate workflow responsible for deploying this documentation website. On pushes to dev or devdocs, it renders the Quarto project and deploys the resulting static site to Cloudflare Pages. It also contains a step that automatically updates the “Last modified” date on each page based on its Git history.
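
As a rough illustration, a docs workflow of this kind could be structured as follows. The branch names, Quarto rendering, Cloudflare Pages target, and the date-stamping idea come from the paragraph above; the action versions, secret names, project name, and the exact date-stamping mechanism are assumptions.

```yaml
# Illustrative sketch of build_quarto_docs.yaml (not the real file).
name: build_quarto_docs
on:
  push:
    branches: [dev, devdocs]

jobs:
  docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0   # full history, needed to read per-page commit dates

      # Purely illustrative: stamp each page with its last Git commit date.
      - name: Update "Last modified" dates
        run: |
          for f in $(git ls-files '*.qmd'); do
            d=$(git log -1 --format=%cs -- "$f")
            sed -i "s/^last-modified:.*/last-modified: $d/" "$f"
          done

      - uses: quarto-dev/quarto-actions/setup@v2
      - name: Render the site
        run: quarto render

      - name: Deploy to Cloudflare Pages
        uses: cloudflare/pages-action@v1
        with:
          apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}    # placeholder secret names
          accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          projectName: monstermessenger-docs               # placeholder project name
          directory: _site
```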