Practical API Testing with Pytest: Catch Bugs Before They Hit Production


If you’re building or integrating with APIs, you need tests that prove more than “the endpoint returns 200.” Practical API testing checks the shape of responses, error handling, authentication, and edge cases—without turning into a fragile, slow suite. In this hands-on guide, you’ll build a clean API test setup using pytest, requests, and jsonschema. Everything here is designed for junior/mid developers who want tests that actually help during refactors.

We’ll assume you’re testing an existing REST API with endpoints like /health, /auth/login, and /projects. You can adapt the same structure to any backend.

What “Good” API Tests Cover

  • Smoke checks: the service is reachable and core endpoints respond.
  • Contract checks: responses match an expected schema (fields, types, required keys).
  • Auth flows: valid tokens work, missing/invalid tokens fail.
  • Negative cases: invalid input returns the right status and error format.
  • Behavior guarantees: idempotency (PUT), pagination shape, sorting rules, etc.

The trick: keep tests stable by asserting behavior and contracts, not incidental details (like exact error messages that change often).
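For example, a check like this stays green when the wording of an error message changes, but fails when its structure does (a minimal sketch; the `assert_error_contract` helper and the error shape are assumptions for illustration, not part of any library):

```python
def assert_error_contract(body: dict) -> None:
    """Assert only the stable parts of an error response: keys and types."""
    err = body.get("error")
    assert isinstance(err, dict), "expected an 'error' object"
    assert isinstance(err.get("code"), str), "expected a string 'code'"
    assert isinstance(err.get("message"), str), "expected a string 'message'"
    # Deliberately NOT asserted: the exact message text, which changes often.


# Passes today and keeps passing after copy edits to the message:
assert_error_contract({"error": {"code": "validation_error",
                                 "message": "Name is required"}})
```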

Project Setup

Create a small test-only Python project (or add to your existing repo):

pip install pytest requests jsonschema python-dotenv

Recommended structure:

```
api-tests/
├── tests/
│   ├── conftest.py
│   ├── test_health.py
│   ├── test_auth.py
│   ├── test_projects.py
│   └── schemas.py
├── .env.example
└── pytest.ini
```

Put configuration in environment variables so your tests can run locally, in staging, or against a dev environment.

```ini
# .env.example
API_BASE_URL=https://api.example.com
API_EMAIL=dev@example.com
API_PASSWORD=dev-password
```

Now add a minimal pytest.ini:

```ini
[pytest]
addopts = -q
testpaths = tests
```

A Reusable API Client + Test Fixtures

Use conftest.py to define shared fixtures. This keeps your tests short and readable.

```python
# tests/conftest.py
import os

import pytest
import requests
from dotenv import load_dotenv

load_dotenv()


def _require_env(name: str) -> str:
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing required env var: {name}")
    return value


@pytest.fixture(scope="session")
def base_url() -> str:
    return _require_env("API_BASE_URL").rstrip("/")


@pytest.fixture(scope="session")
def session() -> requests.Session:
    s = requests.Session()
    # Good defaults for API testing
    s.headers.update({
        "Accept": "application/json",
        "User-Agent": "api-tests/1.0",
    })
    return s


@pytest.fixture(scope="session")
def auth_token(base_url: str, session: requests.Session) -> str:
    email = _require_env("API_EMAIL")
    password = _require_env("API_PASSWORD")
    resp = session.post(
        f"{base_url}/auth/login",
        json={"email": email, "password": password},
        timeout=10,
    )
    assert resp.status_code == 200, resp.text
    data = resp.json()
    # Adjust key names based on your API
    assert "access_token" in data
    return data["access_token"]


@pytest.fixture()
def auth_headers(auth_token: str) -> dict:
    return {"Authorization": f"Bearer {auth_token}"}
```

Notes:

  • requests.Session reuses connections, speeding up test runs.
  • Use timeouts so a hung environment fails quickly.
  • Keep login in a session-scoped fixture so you don’t authenticate for every test.

Smoke Test: Health Endpoint

Start with one fast smoke test that tells you the API is reachable. This is also useful in deployment checks.

```python
# tests/test_health.py
def test_health(base_url, session):
    resp = session.get(f"{base_url}/health", timeout=10)
    assert resp.status_code == 200
    data = resp.json()
    assert data.get("status") in {"ok", "healthy"}
```

This is intentionally simple. If this fails, the rest of the suite probably will too.

Contract Testing with JSON Schema

Contract tests ensure responses have the fields and types your frontend/integrations rely on. You don’t need full OpenAPI tooling to get value—JSON Schema is enough for most teams.

Create a file with reusable schemas:

```python
# tests/schemas.py
PROJECT_SCHEMA = {
    "type": "object",
    "required": ["id", "name", "created_at"],
    "properties": {
        "id": {"type": "integer"},
        "name": {"type": "string", "minLength": 1},
        "description": {"type": ["string", "null"]},
        "created_at": {"type": "string"},
    },
    "additionalProperties": True,
}

PAGINATED_PROJECTS_SCHEMA = {
    "type": "object",
    "required": ["items", "page", "page_size", "total"],
    "properties": {
        "items": {"type": "array", "items": PROJECT_SCHEMA},
        "page": {"type": "integer", "minimum": 1},
        "page_size": {"type": "integer", "minimum": 1},
        "total": {"type": "integer", "minimum": 0},
    },
    "additionalProperties": True,
}

ERROR_SCHEMA = {
    "type": "object",
    "required": ["error"],
    "properties": {
        "error": {
            "type": "object",
            "required": ["code", "message"],
            "properties": {
                "code": {"type": "string"},
                "message": {"type": "string"},
                "details": {"type": ["object", "array", "null"]},
            },
            "additionalProperties": True,
        }
    },
    "additionalProperties": True,
}
```

Then validate responses in tests:

```python
# tests/test_projects.py
from jsonschema import validate

from tests.schemas import PAGINATED_PROJECTS_SCHEMA, PROJECT_SCHEMA


def test_list_projects_contract(base_url, session, auth_headers):
    resp = session.get(
        f"{base_url}/projects?page=1&page_size=10",
        headers=auth_headers,
        timeout=10,
    )
    assert resp.status_code == 200, resp.text
    data = resp.json()
    validate(instance=data, schema=PAGINATED_PROJECTS_SCHEMA)


def test_get_project_contract(base_url, session, auth_headers):
    # Pick an ID that exists in your test environment.
    resp = session.get(f"{base_url}/projects/1", headers=auth_headers, timeout=10)
    assert resp.status_code == 200, resp.text
    validate(instance=resp.json(), schema=PROJECT_SCHEMA)
```

If someone accidentally renames created_at to createdAt or changes id from integer to string, these tests fail immediately—with a clear validation error.

Auth Tests: Prove the API is Protected

It’s surprisingly common to accidentally ship an endpoint without auth middleware. Add a test that ensures protection exists, and another that ensures valid auth works.

```python
# tests/test_auth.py
from jsonschema import validate

from tests.schemas import ERROR_SCHEMA


def test_projects_requires_auth(base_url, session):
    resp = session.get(f"{base_url}/projects", timeout=10)
    assert resp.status_code in {401, 403}
    # If your API returns JSON errors, validate them
    if resp.headers.get("Content-Type", "").startswith("application/json"):
        validate(instance=resp.json(), schema=ERROR_SCHEMA)


def test_projects_accepts_auth(base_url, session, auth_headers):
    resp = session.get(f"{base_url}/projects", headers=auth_headers, timeout=10)
    assert resp.status_code == 200
```

This pair prevents “oops we opened the data endpoint to the world” mistakes.

Negative Tests: Validate Error Handling, Not Just Success

Negative tests confirm your API fails in predictable, client-friendly ways. The key is to assert the status code and the presence of a stable error structure.

```python
# tests/test_projects.py
from jsonschema import validate

from tests.schemas import ERROR_SCHEMA


def test_create_project_missing_name(base_url, session, auth_headers):
    resp = session.post(
        f"{base_url}/projects",
        headers=auth_headers,
        json={"description": "missing name"},
        timeout=10,
    )
    assert resp.status_code == 400
    validate(instance=resp.json(), schema=ERROR_SCHEMA)
```

If you’re building the API, these tests also guide your error format. Consistent errors make frontend work faster and reduce guesswork.

Behavior Tests: Idempotency and Pagination Sanity

Contracts are great, but behavior tests catch logic bugs. Two practical examples:

  • Pagination doesn’t change shape (even when empty).
  • PUT is idempotent (sending the same payload twice results in the same resource state).
```python
# tests/test_projects.py
from jsonschema import validate

from tests.schemas import PAGINATED_PROJECTS_SCHEMA, PROJECT_SCHEMA


def test_pagination_empty_page_is_still_valid(base_url, session, auth_headers):
    resp = session.get(
        f"{base_url}/projects?page=9999&page_size=10",
        headers=auth_headers,
        timeout=10,
    )
    assert resp.status_code == 200
    data = resp.json()
    validate(instance=data, schema=PAGINATED_PROJECTS_SCHEMA)
    assert isinstance(data["items"], list)


def test_put_is_idempotent(base_url, session, auth_headers):
    payload = {"name": "My Project", "description": "Stable payload"}
    r1 = session.put(f"{base_url}/projects/1", headers=auth_headers,
                     json=payload, timeout=10)
    assert r1.status_code in {200, 204}, r1.text
    r2 = session.put(f"{base_url}/projects/1", headers=auth_headers,
                     json=payload, timeout=10)
    assert r2.status_code in {200, 204}, r2.text
    # If the API returns the updated resource, validate it
    if r2.status_code == 200:
        validate(instance=r2.json(), schema=PROJECT_SCHEMA)
```

This kind of test is simple, but it catches “PUT actually creates a new record” or “empty pages return null instead of []” issues that cause real client bugs.

Make Tests Stable: Practical Rules

  • Prefer seeded test data in a dedicated environment (IDs and resources you can rely on).
  • Avoid asserting exact timestamps or full error strings—assert keys and types instead.
  • Use timeouts and keep the suite fast; slow tests get skipped.
  • Separate smoke vs. full suite (e.g., mark long tests with @pytest.mark.slow).
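To make the smoke/full split work, register the marker in pytest.ini so pytest doesn't warn about it (a sketch extending the config shown earlier; the marker name slow is an arbitrary choice):

```ini
[pytest]
addopts = -q
testpaths = tests
markers =
    slow: long-running tests excluded from smoke runs
```

Then `pytest -m "not slow"` gives you a fast smoke run, and plain `pytest` runs everything.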

If your API is not stable enough to have fixed IDs like /projects/1, create resources inside the test and clean them up. That costs a bit more runtime, but improves isolation.
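One way to do that is a function-scoped fixture that creates the resource, yields it to the test, and deletes it afterwards. This is a sketch, assuming POST /projects returns the created resource with an id and that DELETE /projects/{id} exists; adjust to your API:

```python
import pytest


@pytest.fixture()
def temp_project(base_url, session, auth_headers):
    """Create a throwaway project for one test, then clean it up."""
    resp = session.post(
        f"{base_url}/projects",
        headers=auth_headers,
        json={"name": "temp-test-project"},
        timeout=10,
    )
    assert resp.status_code in {200, 201}, resp.text
    project = resp.json()

    yield project  # the test runs here, with a project it can rely on

    # Cleanup runs after the test, even if the test body failed
    session.delete(
        f"{base_url}/projects/{project['id']}",
        headers=auth_headers,
        timeout=10,
    )
```

A test that takes `temp_project` as an argument gets a fresh resource and never leaks it into other tests.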

Run the Suite

Set env vars and run:

```shell
# copy .env.example to .env and fill values
pytest
```

If you want quick feedback during development, run a single file:

pytest tests/test_projects.py

Next Steps

Once you have the basics, the highest-value upgrades are:

  • Add OpenAPI-based validation if your API publishes a spec (auto-validate endpoints against the spec).
  • Add property-based “fuzz” tests for input validation (generate many invalid payloads).
  • Measure performance regressions with a small set of timing assertions on critical endpoints.
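As a taste of the fuzz-test idea, you can generate a batch of invalid payloads from a list of mutations and assert every one is rejected. In this local sketch the API call is replaced by a stand-in validator so it runs without a server; `payload_is_valid` is hypothetical, and in a real suite each loop iteration would be a POST asserting a 400 response:

```python
def payload_is_valid(payload) -> bool:
    """Stand-in for server-side validation of POST /projects."""
    return (
        isinstance(payload, dict)
        and isinstance(payload.get("name"), str)
        and len(payload["name"]) > 0
    )


# Mutations that should each make the payload invalid
bad_names = [None, "", 123, [], {}]
missing_name = [{}, {"description": "no name"}]

invalid_payloads = [{"name": bad} for bad in bad_names] + missing_name

for payload in invalid_payloads:
    assert not payload_is_valid(payload), f"accepted invalid payload: {payload!r}"
```

Libraries like Hypothesis automate generating these mutations, but even a hand-rolled list catches validation gaps.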

But even without those upgrades, a small, well-structured pytest API suite like the one above will catch breaking changes early—and make you far more confident when shipping backend changes.

