
Architecture

EcoGuard is built as a pipeline of data collection, analysis, storage, and presentation.

System layers

  1. External data sources
    • GitLab API for pipeline and job metadata
    • Electricity Maps for live carbon intensity
  2. Processing layer
    • collect_real_data.py, the collector script
    • agent modules in the agents directory
  3. Storage and serving
  4. Presentation layer
    • static dashboard files in the public directory
  5. Automation layer
    • GitHub Actions and GitLab CI/CD workflows
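The layers hand data forward as JSON summaries read by the static dashboard. A minimal sketch of what a dashboard data agent might write; the field names and values here are illustrative assumptions, not taken from EcoGuard's actual schema:

```python
import json

# Hypothetical monthly summary a dashboard data agent could emit for the
# static site; every field name and value below is an illustrative assumption.
summary = {
    "period": "2024-05",
    "total_runtime_seconds": 54_000,
    "estimated_energy_kwh": 3.0,
    "estimated_co2_grams": 900.0,
    "carbon_intensity_g_per_kwh": 300,  # live value would come from Electricity Maps
}

# Write the summary where the backend or static site can read it.
with open("summary.json", "w") as fh:
    json.dump(summary, fh, indent=2)
```

Keeping the handoff as plain JSON files is what lets the presentation layer stay a static site with no database dependency.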

Main data flow

  1. A commit, merge request, or scheduled job starts the pipeline.
  2. The collector fetches GitLab job data and external carbon intensity data.
  3. The agents convert job runtime into energy and CO₂ estimates.
  4. The dashboard data agent writes JSON summaries.
  5. The backend or static site reads those JSON files.
  6. The dashboard displays trends, recommendations, and goals.
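Step 3, converting runtime into energy and CO₂ estimates, reduces to two formulas: energy is power draw times duration, and emissions are energy times grid carbon intensity. A minimal sketch; the 200 W runner power draw and the 300 gCO₂/kWh intensity are illustrative assumptions, not EcoGuard's actual parameters:

```python
# Assumed average power draw of a CI runner, in watts (illustrative only).
RUNNER_POWER_WATTS = 200.0

def energy_kwh(runtime_seconds: float,
               power_watts: float = RUNNER_POWER_WATTS) -> float:
    """Energy in kWh: watts x seconds gives joules; 3.6e6 J per kWh."""
    return power_watts * runtime_seconds / 3.6e6

def co2_grams(runtime_seconds: float, intensity_g_per_kwh: float) -> float:
    """CO2 in grams: energy consumed times grid carbon intensity (gCO2/kWh)."""
    return energy_kwh(runtime_seconds) * intensity_g_per_kwh

# A 10-minute job on a grid at 300 gCO2/kWh:
print(round(co2_grams(600, 300), 2))  # → 10.0
```

The carbon intensity argument is where the live Electricity Maps value plugs in, so the same runtime produces different estimates depending on when and where the job ran.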

Important files
