
Overview

This example shows how to use Geval to monitor API performance metrics and enforce latency, throughput, and error-rate requirements.

Contract

version: 1
name: performance-gate
description: API performance requirements

sources:
  csv:
    metrics:
      - column: latency_ms
        aggregate: p95
        as: latency_p95
      - column: throughput
        aggregate: avg
      - column: error_rate
        aggregate: avg
    evalName:
      fixed: performance-metrics

required_evals:
  - name: performance-metrics
    rules:
      - metric: latency_p95
        operator: "<="
        baseline: fixed
        threshold: 200
        description: P95 latency must be under 200ms
      
      - metric: throughput
        operator: ">="
        baseline: fixed
        threshold: 1000
        description: Throughput must be at least 1000 req/s
      
      - metric: error_rate
        operator: "<="
        baseline: fixed
        threshold: 0.01
        description: Error rate must be under 1%

on_violation:
  action: block

Sample Data

performance-data.csv:
id,latency_ms,throughput,error_rate
1,150,1200,0.005
2,180,1150,0.008
3,195,1100,0.012
4,175,1250,0.003
5,160,1180,0.006
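
For reference, here is a hypothetical TypeScript sketch (not part of Geval) that computes the same aggregates from performance-data.csv and applies the contract's fixed thresholds. The p95 below uses a simple nearest-rank method, which may differ from Geval's exact percentile calculation.

// aggregate-check.ts (illustrative only)
import { readFileSync } from "node:fs";

const rows = readFileSync("performance-data.csv", "utf8")
  .trim()
  .split("\n")
  .slice(1) // drop the header row
  .map((line) => {
    const [, latency_ms, throughput, error_rate] = line.split(",").map(Number);
    return { latency_ms, throughput, error_rate };
  });

// Nearest-rank p95: the value at position ceil(0.95 * n) in the sorted list.
const p95 = (values: number[]) => {
  const sorted = [...values].sort((a, b) => a - b);
  return sorted[Math.ceil(0.95 * sorted.length) - 1];
};

const avg = (values: number[]) =>
  values.reduce((sum, v) => sum + v, 0) / values.length;

const latencyP95 = p95(rows.map((r) => r.latency_ms));    // 195 for the sample data
const throughputAvg = avg(rows.map((r) => r.throughput)); // 1176
const errorRateAvg = avg(rows.map((r) => r.error_rate));  // 0.0068

// The three fixed-baseline rules from the contract.
console.log("latency_p95 <= 200 :", latencyP95 <= 200);
console.log("throughput  >= 1000:", throughputAvg >= 1000);
console.log("error_rate  <= 0.01:", errorRateAvg <= 0.01);

All three comparisons hold for the sample data, which is why the check below passes.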

Running the Check

geval check --contract performance-gate.yaml --eval performance-data.csv

Expected Output

With the sample data above, every rule passes:

✓ PASS

Contract:    performance-gate
Version:     1

All 1 eval(s) passed contract requirements

If a rule were violated, for example if the average error rate rose to 0.015, the same check would block:

✗ BLOCK

Contract:    performance-gate
Version:     1

Blocked: 1 violation(s) in 1 eval

Violations
  1. performance-metrics → error_rate
     error_rate = 0.015, expected <= 0.01

Advanced: Regression Detection

Compare against a previous run to detect performance regressions:
required_evals:
  - name: performance-metrics
    rules:
      - metric: latency_p95
        operator: "<="
        baseline: previous
        max_delta: 0.1
        description: Latency should not increase more than 10%

Pass the previous run's results as the baseline:

geval check \
  --contract performance-gate.yaml \
  --eval current-performance.csv \
  --baseline previous-performance.json
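
Conceptually, a previous-baseline rule compares the current aggregate against the value stored from the last run, and max_delta bounds the allowed relative increase. A rough TypeScript sketch, assuming previous-performance.json simply exposes the prior latency_p95 (the real baseline format is produced by Geval and may differ):

// regression-check.ts (illustrative only)
import { readFileSync } from "node:fs";

const previous = JSON.parse(
  readFileSync("previous-performance.json", "utf8"),
) as { latency_p95: number };

const currentLatencyP95 = 195; // e.g. aggregated from current-performance.csv

// max_delta: 0.1 read as "no more than a 10% relative increase over the baseline".
const delta = (currentLatencyP95 - previous.latency_p95) / previous.latency_p95;
console.log(delta <= 0.1 ? "PASS" : "BLOCK: latency_p95 regressed by more than 10%");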

CI/CD Integration

# .github/workflows/performance-check.yml
name: Performance Check

on: [pull_request]

jobs:
  performance-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-node@v4
        with:
          node-version: 20  # or your project's Node version

      - name: Install dependencies  # so the test script can run
        run: npm ci

      - name: Run performance tests
        run: npm run test:performance -- --output performance-data.csv
      
      - name: Install Geval
        run: npm install -g @geval-labs/cli
      
      - name: Check performance metrics
        run: |
          geval check \
            --contract performance-gate.yaml \
            --eval performance-data.csv
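
Because the contract sets on_violation: action: block, the final step is meant to fail the job when any rule is violated, which keeps the pull request from merging until the metrics are back within their thresholds. This assumes the CLI exits with a non-zero status on a BLOCK result.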