Deployment
Last updated: 2026-01-30
Overview
Deployment of the Monstermessenger application is fully automated using GitHub Actions. The CI/CD pipeline builds, tests, and deploys the application to Google Cloud Platform (GCP). The process distinguishes between development and production environments based on the Git branch, and it deploys only the components that have changed.
Target Architecture
The application is deployed as two separate but connected services on GCP, with distinct youth and adult variants for each environment.
- Backend: The Python FastAPI application is containerized using Docker and deployed to Google Cloud Run. Each variant (youth, adult) runs as a separate Cloud Run service with its own environment variables and configuration.
- Frontend: The React single-page application is deployed as a static website to Google Cloud Storage. Each variant has its own GCS bucket, configured to serve the static index.html file.
- Environments (an illustrative naming sketch follows this list):
  - Development: Pushing to the dev branch deploys to a development environment. Cloud Run services and GCS buckets for this environment are typically suffixed with -dev.
  - Production: Pushing to the main branch deploys to the production environment.
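To make the four resulting deployment targets concrete, the sketch below shows one plausible layout. The service and bucket names are hypothetical placeholders, not values taken from the workflows.

```yaml
# Hypothetical target layout; every service and bucket name here is a placeholder.
environments:
  development:               # deployed from the dev branch
    youth:
      cloud_run_service: monstermessenger-backend-youth-dev
      gcs_bucket: gs://monstermessenger-frontend-youth-dev
    adult:
      cloud_run_service: monstermessenger-backend-adult-dev
      gcs_bucket: gs://monstermessenger-frontend-adult-dev
  production:                # deployed from the main branch
    youth:
      cloud_run_service: monstermessenger-backend-youth
      gcs_bucket: gs://monstermessenger-frontend-youth
    adult:
      cloud_run_service: monstermessenger-backend-adult
      gcs_bucket: gs://monstermessenger-frontend-adult
```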
CI/CD Workflows
The deployment logic is managed by a set of GitHub Actions workflows located in the .github/workflows/ directory.
deploy-variants.yaml
This is the main workflow that orchestrates the entire deployment process.
- Trigger: It runs automatically on every push to the dev and main branches.
- Path Filtering: The first crucial step is check-changes, which uses the dorny/paths-filter action. It inspects the changed files to determine which parts of the application (backend, frontend, requirements) have been modified. This allows the workflow to skip unnecessary build and deploy steps, saving time and resources.
- Multi-Stage Docker Build: The workflow uses an efficient multi-stage build process for the backend:
  - Environment Image (build-env-image): A base Docker image containing all Python dependencies is built using deployment/env.Dockerfile. This job runs only if requirements.txt or env.Dockerfile itself has changed. The resulting image is pushed to Google Artifact Registry and tagged as latest.
  - Application Image: The final application image is built later, using the pre-built environment image as a base. This step is much faster because the dependencies are already installed.
- Reusable Deploy Trigger: For each variant (youth and adult), this workflow calls the reusable-deploy.yaml workflow, passing the correct names, variants, and the results of the path filtering as parameters. A condensed sketch of this orchestration appears directly below.
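The following is a condensed, hypothetical sketch of what deploy-variants.yaml could look like, assuming the job and filter names described above (check-changes, build-env-image) together with placeholder paths, registry locations, secret names, and reusable-workflow input names. Consult the file in .github/workflows/ for the authoritative configuration.

```yaml
# Condensed sketch of deploy-variants.yaml. Job and filter names follow the text
# above; paths, registry locations, secret names, and inputs are placeholders.
name: Deploy variants
on:
  push:
    branches: [dev, main]

jobs:
  check-changes:
    runs-on: ubuntu-latest
    outputs:
      backend: ${{ steps.filter.outputs.backend }}
      frontend: ${{ steps.filter.outputs.frontend }}
      requirements: ${{ steps.filter.outputs.requirements }}
    steps:
      - uses: actions/checkout@v4
      - id: filter
        uses: dorny/paths-filter@v3
        with:
          filters: |
            backend:
              - 'backend/**'
            frontend:
              - 'frontend/**'
            requirements:
              - 'requirements.txt'
              - 'deployment/env.Dockerfile'

  build-env-image:
    # Rebuild the dependency base image only when requirements.txt or env.Dockerfile change.
    needs: check-changes
    if: needs.check-changes.outputs.requirements == 'true'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}   # placeholder secret name
      - uses: google-github-actions/setup-gcloud@v2
      - run: |
          # Placeholder Artifact Registry path.
          gcloud auth configure-docker europe-docker.pkg.dev --quiet
          docker build -f deployment/env.Dockerfile -t europe-docker.pkg.dev/my-project/containers/env:latest .
          docker push europe-docker.pkg.dev/my-project/containers/env:latest

  deploy-youth:
    needs: [check-changes, build-env-image]
    if: ${{ !failure() && !cancelled() }}   # run even when build-env-image was skipped
    uses: ./.github/workflows/reusable-deploy.yaml
    secrets: inherit
    with:
      variant: youth                        # input names are placeholders
      backend_changed: ${{ needs.check-changes.outputs.backend }}
      frontend_changed: ${{ needs.check-changes.outputs.frontend }}

  deploy-adult:
    needs: [check-changes, build-env-image]
    if: ${{ !failure() && !cancelled() }}
    uses: ./.github/workflows/reusable-deploy.yaml
    secrets: inherit
    with:
      variant: adult
      backend_changed: ${{ needs.check-changes.outputs.backend }}
      frontend_changed: ${{ needs.check-changes.outputs.frontend }}
```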
reusable-deploy.yaml
This reusable workflow contains the core logic for building and deploying a single variant of the application. It is called twice by deploy-variants.yaml, once for youth and once for adult.
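Because it is reusable, the workflow is triggered via workflow_call rather than push, and it receives its configuration as inputs. A minimal sketch of this interface, with hypothetical input names matching the sketch above:

```yaml
# Hypothetical workflow_call interface for reusable-deploy.yaml; the real input
# names and types may differ.
on:
  workflow_call:
    inputs:
      variant:             # "youth" or "adult"
        required: true
        type: string
      backend_changed:     # output of the paths-filter step in deploy-variants.yaml
        required: true
        type: string
      frontend_changed:
        required: true
        type: string
```

The workflow then performs the following steps; a condensed sketch of the corresponding job follows the list.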
- Authentication: Authenticates to GCP using a service account stored in GitHub Secrets.
- Build & Push Backend:
  - If backend files have changed (based on the input from the parent workflow), it builds the final application Docker image using deployment/backend.Dockerfile.
  - The CHATBOT_VARIANT is passed as a build argument to configure the container correctly.
  - The image is pushed to Google Container Registry (GCR).
- Deploy to Cloud Run:
  - If the backend was built successfully, it deploys the new image to the corresponding Cloud Run service using gcloud run deploy.
  - It sets the necessary environment variables, such as CHATBOT_VARIANT and CORS_ORIGINS, which points to the frontend’s GCS bucket URL.
- Build & Deploy Frontend:
  - If frontend files have changed, it checks out the code, installs Node.js dependencies with npm ci, and builds the React application with npm run build:dev or npm run build:prod.
  - The backend’s Cloud Run URL is passed as the VITE_API_URL environment variable during the build, connecting the frontend to its API.
  - The compiled static assets from the frontend/dist/ directory are synchronized to the correct Google Cloud Storage bucket using gsutil rsync.
  - The GCS bucket is configured for public web access.
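A compressed, hypothetical sketch of that job is shown below. Only the commands named above (gcloud run deploy, npm ci, npm run build:dev / build:prod, gsutil rsync) and the CHATBOT_VARIANT, CORS_ORIGINS, and VITE_API_URL variables come from the description; the project, region, service, and bucket names are placeholders.

```yaml
# Hypothetical deploy job for a single variant; everything marked "placeholder"
# is illustrative and not taken from the real workflow.
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}    # placeholder secret name
      - uses: google-github-actions/setup-gcloud@v2

      # Build and push the backend image only when backend files changed.
      - name: Build and push backend image
        if: inputs.backend_changed == 'true'
        run: |
          gcloud auth configure-docker --quiet
          IMAGE="gcr.io/my-project/backend-${{ inputs.variant }}:${{ github.sha }}"   # placeholder
          docker build -f deployment/backend.Dockerfile \
            --build-arg CHATBOT_VARIANT=${{ inputs.variant }} \
            -t "$IMAGE" .
          docker push "$IMAGE"

      # Roll out the new image to the variant's Cloud Run service.
      - name: Deploy to Cloud Run
        if: inputs.backend_changed == 'true'
        run: |
          gcloud run deploy "backend-${{ inputs.variant }}" \
            --image "gcr.io/my-project/backend-${{ inputs.variant }}:${{ github.sha }}" \
            --region europe-west1 \
            --set-env-vars "CHATBOT_VARIANT=${{ inputs.variant }},CORS_ORIGINS=https://storage.googleapis.com/frontend-${{ inputs.variant }}"

      # Build the frontend against the backend URL and sync it to the GCS bucket.
      - name: Build and deploy frontend
        if: inputs.frontend_changed == 'true'
        run: |
          cd frontend
          npm ci
          # One way to obtain the service URL; the real workflow may capture it elsewhere.
          BACKEND_URL=$(gcloud run services describe "backend-${{ inputs.variant }}" \
            --region europe-west1 --format 'value(status.url)')
          VITE_API_URL="$BACKEND_URL" npm run build:prod   # build:dev on the dev branch
          gsutil -m rsync -r -d dist "gs://frontend-${{ inputs.variant }}"   # placeholder bucket
```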
build_quarto_docs.yaml
This is a separate workflow responsible for the deployment of this documentation website. On pushes to dev or devdocs, it automatically renders the Quarto project and deploys the resulting static site to Cloudflare Pages. It also contains a step to automatically update the “Last modified” date on each page based on its Git history.
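A minimal sketch of such a workflow is shown below, assuming the quarto-dev/quarto-actions and cloudflare/wrangler-action GitHub Actions, a placeholder Cloudflare Pages project name, and the default _site/ output directory; the real file may differ, including in how the “Last modified” dates are updated.

```yaml
# Minimal sketch of a Quarto-to-Cloudflare-Pages workflow; secret names, project
# name, and output directory are placeholders.
name: Build Quarto docs
on:
  push:
    branches: [dev, devdocs]

jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0           # full Git history, so per-page modification dates are available
      - uses: quarto-dev/quarto-actions/setup@v2
      - run: quarto render         # renders the static site into _site/ by default
      - uses: cloudflare/wrangler-action@v3
        with:
          apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}      # placeholder secret names
          accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          command: pages deploy _site --project-name=my-docs-site   # placeholder project name
```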