API Testing in Practice: A Minimal, Reliable Setup with Pytest + HTTPX + JSON Schema
When an API breaks, it usually breaks in boring ways: a field disappears, a status code changes, validation gets loosened, or an “optional” key suddenly becomes required. The goal of practical API testing is to catch those changes early—with tests that are easy to write, fast to run, and hard to flake.
In this hands-on guide, you’ll build a small but production-friendly API test harness using:
- `pytest` for test running and fixtures
- `httpx` for clean HTTP calls (sync or async)
- `jsonschema` to validate response shapes (contracts)
- A few patterns for auth, environment config, and idempotent test data
The examples are written for junior/mid developers and will work against any REST API you can hit in a test environment.
1) Install Dependencies and Create a Test Layout
Start with a clean test folder structure. This keeps your test suite readable as it grows.
```
pip install pytest httpx jsonschema python-dotenv
```
Create this layout:
```
project/
├── tests/
│   ├── conftest.py
│   ├── test_health.py
│   ├── test_users.py
│   └── schemas/
│       └── user.json
├── .env
└── pytest.ini
```
In .env, put your test environment configuration:
```
API_BASE_URL=https://api.example.test
API_TOKEN=replace-me-with-a-test-token
```
And a simple pytest.ini:
```ini
[pytest]
testpaths = tests
addopts = -q
```
2) Build a Reusable HTTP Client Fixture
A common mistake is repeating base URLs, headers, and timeouts in every test. Instead, centralize them in conftest.py.
```python
# tests/conftest.py
import os

import httpx
import pytest
from dotenv import load_dotenv

load_dotenv()


@pytest.fixture(scope="session")
def base_url() -> str:
    url = os.getenv("API_BASE_URL", "").strip()
    if not url:
        raise RuntimeError("Missing API_BASE_URL in environment or .env")
    return url.rstrip("/")


@pytest.fixture(scope="session")
def api_token() -> str:
    token = os.getenv("API_TOKEN", "").strip()
    if not token:
        raise RuntimeError("Missing API_TOKEN in environment or .env")
    return token


@pytest.fixture(scope="session")
def client(base_url: str, api_token: str) -> httpx.Client:
    headers = {
        "Authorization": f"Bearer {api_token}",
        "Accept": "application/json",
        "Content-Type": "application/json",
    }
    with httpx.Client(
        base_url=base_url,
        headers=headers,
        timeout=httpx.Timeout(10.0),
        follow_redirects=True,
    ) as c:
        yield c
```
This gives you a consistent, fast client with a timeout (timeouts prevent “hung builds”).
3) Start with a Smoke Test (Health + Basic Expectations)
Smoke tests are your first line of defense. They should be quick and check only the essentials.
```python
# tests/test_health.py
def test_health(client):
    r = client.get("/health")
    assert r.status_code == 200
    data = r.json()
    assert data["status"] in ("ok", "healthy")
    assert "version" in data
```
If your API doesn’t have /health, use / or a simple “ping” endpoint. The point is to confirm the service is reachable and returning valid JSON.
4) Add Contract Tests with JSON Schema
Most breakages are contract breakages: the API still returns 200, but the body changed. JSON Schema validation catches this neatly.
Create a schema file for a user response:
`tests/schemas/user.json`:

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "required": ["id", "email", "created_at"],
  "properties": {
    "id": { "type": "string" },
    "email": { "type": "string", "format": "email" },
    "name": { "type": ["string", "null"] },
    "created_at": { "type": "string" }
  },
  "additionalProperties": true
}
```
Now validate an endpoint response against it:
```python
# tests/test_users.py
import json
from pathlib import Path

from jsonschema import validate

USER_SCHEMA = json.loads(
    Path(__file__).with_name("schemas").joinpath("user.json").read_text()
)


def test_get_current_user_contract(client):
    r = client.get("/me")
    assert r.status_code == 200
    data = r.json()
    validate(instance=data, schema=USER_SCHEMA)
```
This is a simple contract test: your endpoint can evolve, but it must keep required keys and types stable.
- If you want stricter contracts, set `"additionalProperties": false`.
- If your API returns numeric IDs, change `"id"` to `{ "type": "integer" }`.
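To see what a contract failure actually looks like, you can run `jsonschema` against sample data directly (the schema is inlined here just for the demo):

```python
from jsonschema import ValidationError, validate

USER_SCHEMA = {
    "type": "object",
    "required": ["id", "email", "created_at"],
    "properties": {
        "id": {"type": "string"},
        "email": {"type": "string"},
        "created_at": {"type": "string"},
    },
}

good = {"id": "u_1", "email": "a@example.com", "created_at": "2024-01-01T00:00:00Z"}
validate(instance=good, schema=USER_SCHEMA)  # passes silently

bad = {"id": "u_1", "email": "a@example.com"}  # created_at went missing
try:
    validate(instance=bad, schema=USER_SCHEMA)
except ValidationError as e:
    print("contract break caught:", e.message)
```

The endpoint still "works" from HTTP's point of view; only the schema check notices that a required key disappeared.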
5) Test Create/Read/Update/Delete Without Flaky Data
CRUD tests often fail because they depend on shared state. You can make them reliable by generating unique data and cleaning up after yourself.
Here’s a pattern using a helper function and try/finally cleanup:
```python
# tests/test_users.py
import uuid

from jsonschema import validate


def unique_email() -> str:
    return f"test-{uuid.uuid4().hex[:10]}@example.com"


def test_user_lifecycle(client):
    # 1) Create
    payload = {"email": unique_email(), "name": "API Test User"}
    create = client.post("/users", json=payload)
    assert create.status_code == 201
    user = create.json()
    user_id = user["id"]

    try:
        # 2) Read
        get_r = client.get(f"/users/{user_id}")
        assert get_r.status_code == 200
        validate(instance=get_r.json(), schema=USER_SCHEMA)

        # 3) Update
        patch_r = client.patch(f"/users/{user_id}", json={"name": "Updated Name"})
        assert patch_r.status_code in (200, 204)

        # Confirm update (some APIs return 204 No Content)
        get2 = client.get(f"/users/{user_id}")
        assert get2.status_code == 200
        assert get2.json().get("name") == "Updated Name"
    finally:
        # 4) Delete (cleanup)
        del_r = client.delete(f"/users/{user_id}")
        assert del_r.status_code in (200, 204, 404)
```
Why allow 404 on delete? In some environments (or retries), the user might already be removed. Allowing 404 keeps cleanup idempotent.
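If several tests need a throwaway user, you can pull the create/cleanup pair into one helper instead of repeating `try/finally`. A sketch using `contextlib` (the `temp_user` name and payload are hypothetical; in a real suite you might wrap this in a pytest fixture):

```python
import uuid
from contextlib import contextmanager


def unique_email() -> str:
    return f"test-{uuid.uuid4().hex[:10]}@example.com"


@contextmanager
def temp_user(client):
    # Create a throwaway user, hand it to the test body, delete it no matter what
    created = client.post("/users", json={"email": unique_email(), "name": "Temp"}).json()
    try:
        yield created
    finally:
        client.delete(f"/users/{created['id']}")


# Usage inside a test:
# with temp_user(client) as user:
#     assert client.get(f"/users/{user['id']}").status_code == 200
```

Every test that uses the helper gets guaranteed cleanup for free, which keeps shared environments from filling up with orphaned records.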
6) Assert Errors Like a Pro (Validation + Auth)
Good API tests don’t only test happy paths. They also verify that failures are consistent and helpful.
Example: creating a user with an invalid email should return a predictable status code and error shape.
```python
def test_create_user_invalid_email(client):
    r = client.post("/users", json={"email": "not-an-email", "name": "Bad"})
    assert r.status_code in (400, 422)
    data = r.json()
    # Adapt this to your API's error format
    assert "error" in data or "errors" in data
```
Example: missing auth should be 401 (or 403, depending on your system).
```python
import httpx


def test_requires_auth(base_url):
    # No auth headers on purpose
    with httpx.Client(base_url=base_url, timeout=10.0) as c:
        r = c.get("/me")
        assert r.status_code in (401, 403)
```
7) Add “Golden” Performance Checks (Lightweight)
You can get a lot of value by asserting that key endpoints don’t exceed a soft threshold. Keep it generous to avoid flaky tests in CI, and only apply it to stable endpoints.
```python
import time


def test_list_users_is_fast_enough(client):
    start = time.perf_counter()
    r = client.get("/users?limit=20")
    elapsed_ms = (time.perf_counter() - start) * 1000
    assert r.status_code == 200
    assert elapsed_ms < 800  # adjust for your environment
```
This isn’t a load test—but it will catch accidental N+1 queries or missing indexes that suddenly make an endpoint 10x slower.
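One optional way to make these thresholds less flaky (a refinement of the test above, not a library feature) is to time the call a few times and assert on the best run, so a single network blip or GC pause doesn't fail the build. The `best_time_ms` helper below is hypothetical:

```python
import time


def best_time_ms(fn, attempts: int = 3) -> float:
    # Best-of-N timing: one slow attempt won't dominate the measurement
    best = float("inf")
    for _ in range(attempts):
        start = time.perf_counter()
        fn()
        best = min(best, (time.perf_counter() - start) * 1000)
    return best


# In a test: assert best_time_ms(lambda: client.get("/users?limit=20")) < 800
```

The best-of-N result reflects what the endpoint is capable of, which is what a regression check actually cares about.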
8) Run Tests Locally and in CI
Run locally:
```
pytest
```
Run a single test file while you iterate:
```
pytest tests/test_users.py -q
```
Tip: keep your tests environment-agnostic. Your CI should be able to point to a staging API by setting API_BASE_URL and API_TOKEN as environment variables (instead of committing secrets).
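For example, a CI job can override the `.env` values inline (the staging URL and the `STAGING_API_TOKEN` secret name are placeholders for your own setup):

```shell
API_BASE_URL="https://staging.example.test" API_TOKEN="$STAGING_API_TOKEN" pytest -q
```

Because `os.getenv` reads real environment variables first, no code changes are needed to retarget the suite.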
Practical Checklist for a Maintainable API Test Suite
- Centralize config (base URL, auth headers, timeouts) in fixtures.
- Validate contracts using JSON Schema, not just status codes.
- Generate unique data (UUID-based emails, names) to avoid collisions.
- Clean up reliably with `try/finally` and idempotent deletes.
- Test errors (invalid payloads, missing auth) so failures stay consistent.
- Add a few speed assertions to catch major regressions early.
Apply just these patterns and you'll end up with API tests that are fast enough to run on every PR, strict enough to catch breaking changes, and simple enough that the whole team can contribute.