
API Schema Validation Testing: OpenAPI, JSON Schema, and Schemathesis

How to automatically validate API responses against schemas — OpenAPI response validation, JSON Schema assertion, Schemathesis property-based testing, and CI integration.

Why Schema Validation Testing

APIs drift from their documentation. A developer changes a response field name, adds a required field, or changes a status code — and the client breaks silently. Schema validation testing catches these breaks automatically, before they reach clients.

There are three distinct problems schema validation solves:

  • Response drift: Your API returns a shape that doesn't match the documented schema (see the sketch after this list)
  • Status code mismatch: A 200 response contains an error body, or vice versa
  • Undocumented edge cases: Input combinations that trigger 500s not mentioned in the spec
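
For a concrete picture of response drift, here is a minimal sketch (the field names are hypothetical) using the jsonschema library covered later on this page:

# A hypothetical drift: the spec documents 'email', the server returns 'e_mail'
from jsonschema import validate, ValidationError

documented_schema = {
    'type': 'object',
    'required': ['id', 'email'],
    'properties': {'id': {'type': 'integer'}, 'email': {'type': 'string'}},
}

actual_response = {'id': 1, 'e_mail': 'a@example.com'}  # renamed field

try:
    validate(instance=actual_response, schema=documented_schema)
except ValidationError as e:
    print(e.message)  # "'email' is a required property"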

OpenAPI Response Validation

The most direct approach is to validate every response against your OpenAPI spec during tests.

openapi-core (Python)

# pip install openapi-core
from openapi_core import OpenAPI
from openapi_core.contrib.requests import RequestsOpenAPIRequest, RequestsOpenAPIResponse
import yaml
import requests

with open('openapi.yaml') as f:
    spec = yaml.safe_load(f)

openapi = OpenAPI.from_dict(spec)

def test_list_users_response_matches_schema():
    response = requests.get('http://localhost:8000/api/users')
    openapi_request = RequestsOpenAPIRequest(response.request)
    openapi_response = RequestsOpenAPIResponse(response)
    # Raises ValidationError if response doesn't match spec
    openapi.validate_response(openapi_request, openapi_response)
    assert response.status_code == 200
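
To avoid repeating the request/response wrapping in every test, the validation can be factored into a small helper; a sketch reusing the openapi object above (the /api/users/1 endpoint is hypothetical):

def assert_matches_spec(response):
    # Wrap the requests objects and validate against the loaded spec;
    # raises a validation error on any mismatch
    openapi.validate_response(
        RequestsOpenAPIRequest(response.request),
        RequestsOpenAPIResponse(response),
    )

def test_get_user_matches_schema():
    response = requests.get('http://localhost:8000/api/users/1')  # hypothetical endpoint
    assert response.status_code == 200
    assert_matches_spec(response)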

Dredd (language-agnostic)

Dredd reads an OpenAPI or API Blueprint spec and runs all documented examples as tests:

npm install -g dredd

# Run against a local server
dredd openapi.yaml http://localhost:8000

# In CI
dredd openapi.yaml http://localhost:8000 --reporter xunit --output dredd-report.xml

Dredd tests every documented endpoint and status code in your spec. If your code returns a 200 but the spec says 201, or the body is missing a required field, the test fails.
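
Real APIs usually need authentication or other per-request setup, which Dredd supports through hook files. A minimal Python sketch, assuming the dredd_hooks package and a hypothetical test token:

# hooks.py  (pip install dredd_hooks)
import dredd_hooks as hooks

@hooks.before_each
def add_auth_header(transaction):
    # Inject a (hypothetical) token into every request Dredd sends
    transaction['request']['headers']['Authorization'] = 'Bearer test-token'

Run it with the hook file attached:

dredd openapi.yaml http://localhost:8000 --hookfiles=hooks.py --language=python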

Schemathesis

Schemathesis is a property-based testing tool for APIs. Instead of testing a fixed set of examples, it generates hundreds of input combinations from your OpenAPI spec and looks for any response that violates the schema — especially 500 errors.

pip install schemathesis

# Run against a local server with all built-in response checks
st run http://localhost:8000/openapi.yaml --checks all

# Run with authentication
st run http://localhost:8000/openapi.yaml \
  --header 'Authorization: Bearer test-token'

# Run with specific endpoint filter
st run http://localhost:8000/openapi.yaml --endpoint '/api/users'

Schemathesis in pytest

# pip install schemathesis
import schemathesis

schema = schemathesis.from_uri('http://localhost:8000/openapi.yaml')

@schema.parametrize()
def test_api(case):
    response = case.call()
    case.validate_response(response)

This generates test cases for every operation in the spec. Schemathesis checks:

  • Status code is documented in the spec
  • Response body matches the schema for that status code
  • No undocumented 5xx errors occur
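
Case generation is driven by Hypothesis, so volume and runtime can be tuned with standard Hypothesis settings on the parametrized test; a short sketch reusing the schema object above:

from hypothesis import settings

@schema.parametrize()
@settings(max_examples=50)  # cap generated cases per operation
def test_api_bounded(case):
    # call_and_validate() combines case.call() and case.validate_response()
    case.call_and_validate()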

Custom Checks

from schemathesis import check

@check
def no_500_errors(response, case):
    assert response.status_code != 500, (
        f'Endpoint returned 500: {response.text[:200]}'
    )

@check
def content_type_is_json(response, case):
    if response.status_code == 200:
        assert 'application/json' in response.headers.get('Content-Type', ''), (
            f'Expected JSON content type, got {response.headers.get("Content-Type")}'
        )
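
When running through the CLI instead of pytest, registered checks are loaded from a hooks module named via the SCHEMATHESIS_HOOKS environment variable (the module name here is hypothetical):

SCHEMATHESIS_HOOKS=test_hooks st run http://localhost:8000/openapi.yaml --checks all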

JSON Schema Assertion

For ad-hoc response validation in unit and integration tests:

# pip install jsonschema
from jsonschema import FormatChecker, ValidationError, validate
import pytest

USER_SCHEMA = {
    'type': 'object',
    'required': ['id', 'name', 'email', 'created_at'],
    'properties': {
        'id': {'type': 'integer'},
        'name': {'type': 'string', 'minLength': 1},
        'email': {'type': 'string', 'format': 'email'},
        'created_at': {'type': 'string', 'format': 'date-time'},
        'role': {'type': 'string', 'enum': ['admin', 'viewer', 'editor']},
    },
    'additionalProperties': False,
}

def test_user_response_schema():
    response = client.get('/api/users/1')  # 'client' is your HTTP test client fixture
    assert response.status_code == 200
    try:
        # Pass a FormatChecker: 'format' assertions are off by default, and
        # some formats (e.g. date-time) also need extras like rfc3339-validator
        validate(instance=response.json(), schema=USER_SCHEMA,
                 format_checker=FormatChecker())
    except ValidationError as e:
        pytest.fail(f'Response does not match schema: {e.message}')
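
The same object schema composes into schemas for collection endpoints; a short sketch assuming a hypothetical list route at /api/users:

USERS_LIST_SCHEMA = {
    'type': 'array',
    'items': USER_SCHEMA,  # validate every element against the object schema
}

def test_users_list_schema():
    response = client.get('/api/users')
    assert response.status_code == 200
    validate(instance=response.json(), schema=USERS_LIST_SCHEMA,
             format_checker=FormatChecker())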

JavaScript with Ajv

const request = require('supertest');  // HTTP assertions against the app
const app = require('../app');         // your Express app; path may differ
const Ajv = require('ajv');
const addFormats = require('ajv-formats');

const ajv = new Ajv({ allErrors: true });  // report every violation, not just the first
addFormats(ajv);

const userSchema = {
  type: 'object',
  required: ['id', 'name', 'email'],
  properties: {
    id: { type: 'integer' },
    name: { type: 'string' },
    email: { type: 'string', format: 'email' },
  },
};

test('GET /users/:id matches schema', async () => {
  const res = await request(app).get('/users/1');
  expect(res.statusCode).toBe(200);
  const valid = ajv.validate(userSchema, res.body);
  // Surface the validation details before the boolean assertion fails
  if (!valid) throw new Error(ajv.errorsText(ajv.errors));
  expect(valid).toBe(true);
});
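
If many tests assert against the same schema, Ajv can compile it once with ajv.compile(userSchema); the returned validator function is reusable across tests and exposes failure details on its .errors property.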

CI Pipeline Integration

Run schema validation as a CI gate — fail the build if the API violates its own spec:

# .github/workflows/api-tests.yml
name: api-tests
on: [push, pull_request]

jobs:
  schema-validation:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Start API server
        run: docker compose up -d api
      - name: Wait for API
        # --retry-connrefused makes curl keep retrying while the server starts
        run: curl --fail --retry 10 --retry-delay 2 --retry-connrefused http://localhost:8000/health
      - name: Run Schemathesis
        run: |
          pip install schemathesis
          st run http://localhost:8000/openapi.yaml \
            --checks all \
            --stateful=links \
            --junit-xml=schemathesis-report.xml
      - name: Upload report
        uses: actions/upload-artifact@v4
        with:
          name: schemathesis-report
          path: schemathesis-report.xml

Spec Coverage

Track how much of your OpenAPI spec is exercised by tests:

# Schemathesis reports coverage per operation
st run openapi.yaml --report --report-dir ./coverage
# Look for operations with 'not_covered' status

Any operation with no test coverage is a blind spot. At minimum, every documented endpoint should have one test for each documented status code.
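
An explicit pytest gate for this can be hand-rolled by diffing the spec against a registry of tested operations; a minimal sketch, assuming a hypothetical TESTED_OPERATIONS set maintained alongside the suite:

import yaml

# Hypothetical hand-maintained registry of (method, path) pairs with tests
TESTED_OPERATIONS = {
    ('get', '/api/users'),
    ('get', '/api/users/{id}'),
}

def test_every_documented_operation_has_coverage():
    with open('openapi.yaml') as f:
        spec = yaml.safe_load(f)
    http_methods = {'get', 'post', 'put', 'patch', 'delete', 'head', 'options'}
    documented = {
        (method, path)
        for path, operations in spec.get('paths', {}).items()
        for method in operations
        if method in http_methods
    }
    missing = documented - TESTED_OPERATIONS
    assert not missing, f'Operations with no test coverage: {sorted(missing)}'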
