How to Build a Mock Server for Government APIs to Speed Up Development and Testing

Public-sector APIs are often mission-critical — and notoriously slow to access in pre‑production. Vendor queues, restricted sandboxes, and strict change controls can stall delivery for weeks. A well‑designed mock server removes those bottlenecks, giving your teams realistic endpoints, rigorous data validation, and controllable failure scenarios from day one — without waiting on a government sandbox.

This guide shows how to design and implement a production‑grade mock server tailored to EU GovTech needs, covering validation, compliance, tooling, and CI integration.

Key outcomes:

  • Faster time‑to‑first‑integration — days instead of weeks
  • Fewer late‑stage surprises — contract issues found in CI
  • Repeatable, auditable tests — aligned with GDPR, DORA, and eIDAS expectations

What Is a Mock Server — And Why It Matters in GovTech

A mock server simulates a real API by returning predefined or dynamically generated responses for requests that match an agreed contract (OpenAPI/JSON Schema). In GovTech, it’s not just a convenience — it is a strategic control that:

  • Enforces schema compliance early (preventing expensive rework later)
  • Simulates authentication and authorization behaviors safely
  • Recreates edge cases (timeouts, malformed payloads, rate limits)
  • Decouples product teams from slow third‑party release cycles

When to Use a Mock Server

  • Early product discovery — validate flows without real credentials
  • Local development — productive work without VPNs and PKI hurdles
  • Continuous integration — contract tests in every PR
  • System integration testing — predictable data and scenarios
  • UAT and demo environments — stable, explainable behaviors for stakeholders
  • Incident drills — simulate upstream failures for resilience testing

GovTech‑Grade Requirements

  • Contract fidelity — OpenAPI 3.0+ with JSON Schema validation on requests and responses
  • Authentication simulation — OAuth 2.0 / OIDC, mTLS, and eIDAS QWAC/QSealC flows emulated without real secrets
  • Reference data and code lists — strict validation against official taxonomies (NACE, ISO country codes, fiscal codes, Peppol code lists)
  • Stateful flows — reproduce multi‑step processes (submission → polling → result)
  • Idempotency and replay — stable behavior on retries with tokens/headers (see the sketch after this list)
  • Rate limits and quotas — match production policies and headers
  • Error injection — timeouts, 4xx/5xx, partial failures, invalid signatures
  • Observability — correlation IDs, structured logs, trace headers for audits
  • Data protection — synthetic or fully anonymised fixtures; clear retention policy; no production data
  • Versioning and deprecation — mirror upstream lifecycle, with parity checks in CI
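
To make the idempotency requirement concrete, here is a minimal sketch of replay handling in a FastAPI‑based mock. The /v1/payments endpoint, the Idempotency-Key header, and the in‑memory cache are illustrative assumptions rather than part of any specific upstream API.

from fastapi import FastAPI, Header

app = FastAPI()
replay_cache: dict = {}  # previously issued responses, keyed by Idempotency-Key

@app.post("/v1/payments", status_code=201)
def create_payment(idempotency_key: str = Header(..., alias="Idempotency-Key")):
    # a retried request with the same key replays the original response instead of creating a duplicate
    if idempotency_key in replay_cache:
        return replay_cache[idempotency_key]
    response = {"id": f"PAY-{len(replay_cache) + 1}", "status": "accepted"}
    replay_cache[idempotency_key] = response
    return response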

Architecture Patterns

  • Specification‑first — treat OpenAPI as the single source of truth
  • Validation in the loop — reject non‑conformant payloads early
  • Data fixture service — curated synthetic datasets and code lists
  • Dynamic templating — responses using request‑aware templates (e.g., Handlebars/JS expressions)
  • Stateful simulators — in‑memory or lightweight storage to model workflows
  • Contract testing — consumer‑driven contracts (Pact) and API contract conformance (Dredd/Schemathesis)
  • CI gates — fail builds on spec drift or schema violations
  • Parity monitor — nightly comparison of upstream spec vs mock copy
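
As one way to implement the parity monitor, the sketch below pulls the upstream OpenAPI document and diffs its operations against the local copy. The upstream URL and local path are placeholders, and only paths and methods are compared, not full schemas.

import requests
import yaml

UPSTREAM_SPEC_URL = "https://upstream.example/openapi.yaml"  # placeholder: the producer's published spec
LOCAL_SPEC_PATH = "api.yaml"                                 # the copy the mock is built from
HTTP_METHODS = {"get", "post", "put", "patch", "delete"}

def operations(spec: dict) -> set:
    # "GET /v1/submissions"-style keys for every operation in the spec
    return {f"{m.upper()} {p}" for p, item in spec.get("paths", {}).items() for m in item if m in HTTP_METHODS}

upstream = yaml.safe_load(requests.get(UPSTREAM_SPEC_URL, timeout=30).text)
with open(LOCAL_SPEC_PATH, encoding="utf-8") as fh:
    local = yaml.safe_load(fh)

missing = operations(upstream) - operations(local)
extra = operations(local) - operations(upstream)
if missing or extra:
    print("Spec drift:", {"missing_in_mock": sorted(missing), "not_in_upstream": sorted(extra)})
    raise SystemExit(1)  # fail the nightly job so the drift gets triaged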

Tooling — What to Use When

  • Prism by Stoplight — OpenAPI mock/validator. Best for spec‑first mocks with strict validation. Validation: yes; stateful: limited. Great for a fast start; strong validation.
  • WireMock — HTTP stub server. Best for complex stateful scenarios and fault injection. Validation: partial (via schemas); stateful: yes. JVM‑based; mature ecosystem and extensions.
  • MockServer — HTTP/SOCKS mock and proxy. Best for advanced matching and expectations. Validation: partial; stateful: yes. Strong for service‑level testing.
  • Postman Mock Server — hosted mock. Best for quick team demos and collections. Validation: limited; stateful: no. Easy to share; weaker validation.
  • MSW — client‑side (frontend) mocking. Best for UI development without a backend. Validation: n/a; runs locally. Pairs well with server‑side mocks.
  • JSON Server / Mirage — rapid prototyping. Best for quick CRUD prototypes. Validation: no; stateful: some. Not suitable for strict compliance.
  • Pact — contract testing. Best for consumer‑driven contracts in CI. Validation and statefulness: n/a. Catches breaking changes early.
  • Dredd / Schemathesis — spec conformance testing. Best for verifying requests and responses against the spec. Validation: yes; stateful: n/a. Automate as a CI gate.

Step‑by‑Step Implementation

1) Define the contract

  • Collect the producer’s OpenAPI and JSON Schemas; if unavailable, draft your own from docs and refine via reviews.
  • Encode code lists and reference data as JSON and link them in schemas using enum or $ref.
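
For illustration, a code list can be inlined (or loaded from its exported JSON file) and enforced as an enum with any JSON Schema validator. The field names and the Python jsonschema library below are assumptions made for this sketch.

from jsonschema import validate, ValidationError

# in practice, load the official export, e.g. json.load(open("data/iso-country-codes.json"))
country_codes = ["AT", "BE", "DE", "ES", "FR", "IT", "NL"]  # abbreviated for the sketch

submission_schema = {
    "type": "object",
    "required": ["countryCode", "naceCode"],
    "properties": {
        "countryCode": {"type": "string", "enum": country_codes},
        "naceCode": {"type": "string", "pattern": r"^[0-9]{2}\.[0-9]{2}$"},
    },
}

try:
    validate({"countryCode": "XX", "naceCode": "62.01"}, submission_schema)
except ValidationError as err:
    print("rejected:", err.message)  # 'XX' is not one of the official codes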

2) Spin up a validating mock

  • Prism quick start:
    • Install: npm install -g @stoplight/prism-cli
    • Run: prism mock api.yaml --errors (request validation against the spec is built in; --errors turns violations into error responses)
  • Or choose WireMock/MockServer if you need richer state and fault injection.

3) Add synthetic datasets

  • Create fixtures for typical and edge cases (valid, borderline, invalid).
  • Ensure GDPR compliance: synthetic by design, no production data; define retention and access controls.
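
A seeded generator is one simple way to keep fixtures synthetic yet reproducible. The field names below mirror the SubmitRequest model used later in this guide; the seed and output path are arbitrary.

import json
import os
import random

rng = random.Random(20240101)  # fixed seed: reproducible runs, obviously non-production data

def synthetic_submission(i: int) -> dict:
    return {
        "taxpayerId": "TEST" + "".join(rng.choices("ABCDEFGHJKMNPQRSTUVWXYZ23456789", k=8)),
        "amount": round(rng.uniform(0, 10_000), 2),
        "scenario": "valid" if i % 3 else "borderline",  # tag each fixture by test intent
    }

os.makedirs("fixtures", exist_ok=True)
with open("fixtures/submissions.json", "w", encoding="utf-8") as fh:
    json.dump([synthetic_submission(i) for i in range(50)], fh, indent=2)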

4) Simulate auth and policy controls

  • Emulate OAuth tokens (signed but test‑only), mTLS verification shortcuts, eIDAS certificate checks as deterministic rules.
  • Mirror headers: rate limit, correlation IDs, idempotency keys.
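
A sketch of test‑only token handling, assuming the PyJWT library and a hard‑coded, clearly fake HS256 key; real production flows use asymmetric keys and certificate chains, so document the differences.

import time
import jwt  # PyJWT

TEST_SIGNING_KEY = "mock-only-signing-key-not-a-secret"  # deliberately fake, never reused anywhere real

def issue_test_token(client_id: str, scope: str) -> str:
    claims = {"iss": "mock-idp", "sub": client_id, "scope": scope, "exp": int(time.time()) + 300}
    return jwt.encode(claims, TEST_SIGNING_KEY, algorithm="HS256")

def verify_test_token(token: str) -> dict:
    # deterministic rule: anything signed with the test key by "mock-idp" is accepted
    return jwt.decode(token, TEST_SIGNING_KEY, algorithms=["HS256"], issuer="mock-idp")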

5) Implement stateful flows

  • Use in‑memory stores or lightweight DB to track submissions and polling.
  • Provide deterministic transitions (e.g., submitted → processing → accepted/rejected) based on test scenario flags.

6) Inject realistic failures

  • Toggle faults via headers or query params: x-simulate: timeout|5xx|invalid_signature|rate_limited.
  • Randomize within safe bounds to catch flaky assumptions without breaking reproducibility.
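
One way to centralise the toggles is an HTTP middleware in a FastAPI‑based mock, sketched below. The scenario names follow the x-simulate convention above; the 20% flake rate and the seed are arbitrary choices that keep runs reproducible.

import asyncio
import random

from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

app = FastAPI()
rng = random.Random(1234)  # seeded so "random" faults replay identically across runs

@app.middleware("http")
async def simulate_faults(request: Request, call_next):
    scenario = request.headers.get("x-simulate", "ok")
    if scenario == "timeout":
        await asyncio.sleep(60)  # hold the request until the client gives up
    if scenario == "5xx":
        return JSONResponse(status_code=503, content={"error": "UPSTREAM_UNAVAILABLE"})
    if scenario == "rate_limited":
        return JSONResponse(status_code=429, content={"error": "RATE_LIMITED"}, headers={"Retry-After": "30"})
    if scenario == "flaky" and rng.random() < 0.2:
        return JSONResponse(status_code=502, content={"error": "BAD_GATEWAY"})
    return await call_next(request)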

7) Wire into CI/CD

  • Run the mock as a service in test pipelines (Docker).
  • Add gates:
    • Contract tests (Pact)
    • Spec conformance (Dredd/Schemathesis)
    • Schema generation diff (fail on breaking changes)
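
As a sketch of the conformance gate, Schemathesis can generate requests from the spec and replay them against the mock in the pipeline. This assumes Schemathesis 3.x, the spec stored as api.yaml, and the mock listening on port 8080 as in the examples below.

# test_conformance.py, executed with pytest in the pipeline
import schemathesis

schema = schemathesis.from_path("api.yaml", base_url="http://localhost:8080")

@schema.parametrize()
def test_spec_conformance(case):
    # sends generated requests to the running mock and validates responses against the spec
    case.call_and_validate()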

8) Govern changes

  • Track mock and spec versions; maintain a deprecation calendar aligned to upstream.
  • Publish a changelog and “what’s new” notes for integrators.

Minimal Example — Node.js with OpenAPI Validation (AJV)

 

  • package.json scripts

{
  "scripts": {
    "start:mock": "node mock/index.js",
    "test:contract": "dredd api.yaml http://localhost:8080"
  }
}

  • mock/index.js (Express + Ajv)

import express from 'express';
import Ajv from 'ajv';
import addFormats from 'ajv-formats';
import { readFileSync } from 'node:fs';

// Compiled JSON Schemas derived from the OpenAPI document (see notes below)
const api = JSON.parse(readFileSync(new URL('./api.schemas.json', import.meta.url), 'utf-8'));

const app = express();
app.use(express.json());

const ajv = new Ajv({ allErrors: true, strict: true });
addFormats(ajv);

const validators = {
  submitRequest: ajv.compile(api.components.schemas.SubmitRequest),
  submitResponse: ajv.compile(api.components.schemas.SubmitResponse)
};

app.post('/v1/submissions', (req, res) => {
  const validReq = validators.submitRequest(req.body);
  if (!validReq) {
    return res.status(400).json({ error: 'INVALID_REQUEST', details: validators.submitRequest.errors });
  }

  const scenario = req.header('x-simulate') || 'ok';
  if (scenario === 'timeout') return; // never respond, so the client hits its timeout
  if (scenario === '5xx') return res.status(503).json({ error: 'UPSTREAM_UNAVAILABLE' });

  const response = { id: 'SUB-123', status: 'accepted', receivedAt: new Date().toISOString() };

  const validRes = validators.submitResponse(response);
  if (!validRes) {
    return res.status(500).json({ error: 'INVALID_RESPONSE', details: validators.submitResponse.errors });
  }

  res.set('x-correlation-id', req.header('x-correlation-id') || 'corr-' + Date.now());
  return res.status(201).json(response);
});

app.listen(8080, () => console.log('Mock server on :8080'));

  • Notes
    • Compile your OpenAPI to JSON Schemas (e.g., openapi‑to‑json‑schema) and store in api.schemas.json.
    • Add code lists via enums in schemas for strict validation.

Minimal Example — FastAPI for Stateful Polling

from fastapi import FastAPI, Header, HTTPException
from pydantic import BaseModel, Field
from typing import Dict
import time

app = FastAPI()
store: Dict[str, str] = {}

class SubmitRequest(BaseModel):
    taxpayerId: str = Field(..., pattern=r"^[A-Z0-9]{8,16}$")
    amount: float = Field(..., ge=0.0)

class SubmitResponse(BaseModel):
    id: str
    status: str

@app.post("/v1/submissions", response_model=SubmitResponse, status_code=201)
def submit(req: SubmitRequest, x_simulate: str | None = Header(default=None)):
    if x_simulate == "5xx":
        raise HTTPException(status_code=503, detail="UPSTREAM_UNAVAILABLE")
    sub_id = "SUB-" + str(int(time.time()))
    store[sub_id] = "processing"
    return SubmitResponse(id=sub_id, status="received")

@app.get("/v1/submissions/{sub_id}", response_model=SubmitResponse)
def poll(sub_id: str):
    status = store.get(sub_id)
    if not status:
        raise HTTPException(status_code=404, detail="NOT_FOUND")
    # deterministic state progression for demo purposes
    store[sub_id] = "accepted" if status == "processing" else status
    return SubmitResponse(id=sub_id, status=store[sub_id])

Testing Strategy — What to Automate

  • Contract conformance
    • Dredd against OpenAPI
    • Schemathesis for property‑based fuzzing
  • Consumer‑driven contracts
    • Pact to ensure producers don’t break consumers
  • Validation coverage
    • Required fields, enums, formats, pattern checks, nested objects
  • Negative paths and faults
    • Invalid tokens, expired certificates, mismatched signatures, rate limits, timeouts (see the pytest sketch after this list)
  • Performance smoke
    • Baseline latency, concurrency, head‑of‑line behaviors
  • Security hygiene
    • No secrets in configs, deterministic synthetic data, access logs reviewed
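
As a sketch of two negative‑path checks with pytest: the base URL, payload fields, and expected status codes (400 for a schema violation, 503 for the simulated outage) follow the examples earlier in this guide.

import requests

BASE_URL = "http://localhost:8080"  # the mock server from the examples above

def test_rejects_invalid_taxpayer_id():
    payload = {"taxpayerId": "not valid!", "amount": 10.0}
    resp = requests.post(f"{BASE_URL}/v1/submissions", json=payload, timeout=5)
    assert resp.status_code == 400  # request validation should reject the pattern violation

def test_simulated_upstream_outage():
    payload = {"taxpayerId": "TESTAB12", "amount": 10.0}
    headers = {"x-simulate": "5xx"}
    resp = requests.post(f"{BASE_URL}/v1/submissions", json=payload, headers=headers, timeout=5)
    assert resp.status_code == 503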

Compliance and Governance — EU Expectations

  • GDPR and data minimisation — use synthetic datasets by default; define retention windows for logs and fixtures
  • DORA resilience — rehearse upstream failures and recovery paths; log and trace test drills
  • eIDAS 2.0 alignment — emulate certificate‑based flows, but segregate test keys; document how real QWAC/QSealC differ in production
  • Auditability — structured logs, correlation IDs, and traceability of test scenarios
  • Change control — version locks, deprecation schedules, and formal sign‑off of breaking changes

KPIs to Prove ROI

  • Time to first successful end‑to‑end flow — target < 3 days
  • Contract defect rate discovered in CI — trending upward initially, then down over time
  • Post‑UAT integration defects — target near‑zero
  • Mock parity drift incidents — track and remediate in < 48 hours
  • Test coverage of critical endpoints — target ≥ 80% by scenario count

Common Pitfalls — And How to Avoid Them

  • Happy‑path only mocks — add negative and edge scenarios from the start
  • No schema validation — wire in JSON Schema validation for both requests and responses
  • Stale contracts — nightly spec diff and CI gates
  • Using production‑like secrets — never; use deterministic, clearly fake credentials
  • Ignoring code lists — enforce enums and reference datasets to mirror reality
  • Hard‑coded delays — implement configurable, bounded latency and failure rates

Rollout Checklist

1) OpenAPI 3.0+ and JSON Schemas finalised and reviewed
2) Validating mock server running locally and in CI
3) Synthetic datasets and code lists published and versioned
4) Stateful flows and error injection implemented
5) Auth simulations documented (OAuth, mTLS, eIDAS)
6) Contract tests (Pact) and conformance tests (Dredd/Schemathesis) green
7) Observability — correlation IDs, structured logs, rate‑limit headers
8) Governance — versioning, changelog, deprecation policy, access controls

FAQ

  • Is a mock server the same as a sandbox? — No. A sandbox is usually hosted by the provider; a mock is under your control and can simulate more scenarios, faster.
  • Should we simulate OAuth and mTLS? — Yes, but with clearly test‑only keys and deterministic validation rules.
  • How do we keep mocks current? — Automate spec diffing, run contract tests in CI, and align mock versioning with upstream releases.
  • Can we use real data? — Prefer synthetic data. If pseudonymised, document the DPIA, scope, and retention.

Summary

A GovTech‑grade mock server is a strategic accelerator — not a developer toy. Make the OpenAPI the source of truth, validate everything with JSON Schemas, simulate real‑world auth and state, and wire contract tests into CI. With parity checks, synthetic data, and strong observability, you will cut lead time, reduce rework, and meet EU compliance expectations — while shipping faster and with confidence.