Practical API Testing with pytest + httpx: Ship Safer Endpoints Without a Massive Framework
API bugs are expensive because they hide behind “it worked on my machine” assumptions: a missing field, a slightly different status code, a pagination edge case, or a breaking change deployed quietly. The good news: you don’t need a huge test suite to get real value. A small, consistent set of API tests—written like a client would use your API—catches most regressions early.
This article shows a hands-on approach to testing REST APIs using pytest and httpx (a modern HTTP client). You’ll build:
- a clean test structure
- fixtures for auth and base URL
- tests for status codes, payload shape, and error responses
- schema validation with jsonschema
- a tiny pattern for testing pagination and idempotency
1) Setup: dependencies and project layout
Install the tools:
```bash
pip install pytest httpx python-dotenv jsonschema
```
Suggested layout:
```
your-project/
  tests/
    conftest.py
    test_users.py
    test_auth.py
    schemas.py
  .env
  pytest.ini
```
Create a .env file to avoid hardcoding secrets:
```
API_BASE_URL=https://api.example.com
API_TOKEN=replace-me
```
Add a minimal pytest.ini:
```ini
[pytest]
addopts = -q
testpaths = tests
```
2) A simple HTTP client fixture (and why it matters)
Fixtures reduce copy/paste and make tests consistent. In tests/conftest.py, create a client that automatically uses your base URL and auth headers.
```python
import os

import httpx
import pytest
from dotenv import load_dotenv

load_dotenv()


@pytest.fixture(scope="session")
def base_url() -> str:
    url = os.getenv("API_BASE_URL")
    if not url:
        raise RuntimeError("API_BASE_URL is not set")
    return url.rstrip("/")


@pytest.fixture(scope="session")
def api_token() -> str:
    token = os.getenv("API_TOKEN")
    if not token:
        raise RuntimeError("API_TOKEN is not set")
    return token


@pytest.fixture
def client(base_url: str, api_token: str):
    headers = {
        "Authorization": f"Bearer {api_token}",
        "Accept": "application/json",
    }
    with httpx.Client(base_url=base_url, headers=headers, timeout=10.0) as c:
        yield c
```
Why this is useful:
- All tests use the same timeout and headers.
- You can switch environments by changing .env.
- If your API requires custom headers (tenant ID, API version), add them once.
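If you do end up needing per-tenant or versioning headers, one option is to centralize them in a small helper that the fixture calls. A minimal sketch, with `X-Tenant-ID` as a placeholder name (substitute whatever header your API actually expects):

```python
from typing import Optional


def build_headers(token: str, tenant_id: Optional[str] = None) -> dict:
    """Build the shared header set once; every test client reuses it."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",
    }
    if tenant_id:
        # Hypothetical multi-tenant header -- rename to match your API
        headers["X-Tenant-ID"] = tenant_id
    return headers
```

The `client` fixture would then call `build_headers(api_token)` instead of building the dict inline, so every test picks up header changes in one place.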
3) Test the “contract”: status codes + shape + key fields
Many API failures aren’t “server is down”—they’re “response structure changed.” To catch that, assert:
- HTTP status code
- content type
- required keys and basic types
Example: testing GET /users/{id} in tests/test_users.py.
```python
import uuid


def test_get_user_by_id_returns_expected_shape(client):
    user_id = "123"  # use a known stable fixture user in your staging env
    r = client.get(f"/users/{user_id}")

    assert r.status_code == 200
    assert r.headers["content-type"].startswith("application/json")

    data = r.json()

    # Contract checks (shape + a few important constraints)
    assert "id" in data
    assert "email" in data
    assert "createdAt" in data
    assert str(data["id"]) == user_id
    assert "@" in data["email"]

    # If you use UUIDs:
    # uuid.UUID(data["id"])  # raises if invalid
```
Tip: Don’t over-assert every field unless you truly need it. Focus on what clients rely on.
4) Validate JSON responses with a schema (without reinventing typing)
Schema validation is the sweet spot between “too loose” and “too strict.” You can define a small JSON Schema for critical endpoints and validate responses with jsonschema.
Create tests/schemas.py:
```python
USER_SCHEMA = {
    "type": "object",
    "required": ["id", "email", "createdAt"],
    "properties": {
        "id": {"type": ["string", "integer"]},
        "email": {"type": "string", "minLength": 3},
        "createdAt": {"type": "string"},
        "name": {"type": "string"},
    },
    "additionalProperties": True,
}
```
Use it in a test:
```python
from jsonschema import validate

from tests.schemas import USER_SCHEMA


def test_get_user_schema(client):
    user_id = "123"
    r = client.get(f"/users/{user_id}")
    assert r.status_code == 200

    data = r.json()
    validate(instance=data, schema=USER_SCHEMA)
```
This catches accidental breaking changes like renaming createdAt to created_at or returning email as null.
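To see what that protection looks like in isolation, here is a quick local demonstration (no API call involved) of `jsonschema` rejecting exactly that kind of rename; the schema is inlined here so the snippet is self-contained:

```python
from jsonschema import ValidationError, validate

USER_SCHEMA = {
    "type": "object",
    "required": ["id", "email", "createdAt"],
    "properties": {
        "id": {"type": ["string", "integer"]},
        "email": {"type": "string", "minLength": 3},
        "createdAt": {"type": "string"},
    },
}

# A response where createdAt was silently renamed to created_at:
renamed = {"id": "123", "email": "a@b.com", "created_at": "2024-01-01"}

try:
    validate(instance=renamed, schema=USER_SCHEMA)
    caught = False
except ValidationError:
    caught = True  # "'createdAt' is a required property"
```

The `validate` call raises `ValidationError`, so `caught` ends up `True` and the test fails loudly instead of letting the rename slip through.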
5) Test error responses like they’re first-class features
Clients depend on consistent error handling. If your API returns structured errors (recommended), test them. Example: POST /users with invalid payload.
```python
def test_create_user_validation_error(client):
    payload = {"email": "not-an-email"}  # missing required fields, wrong format, etc.
    r = client.post("/users", json=payload)

    assert r.status_code in (400, 422)
    data = r.json()

    # Adjust these keys to match your API's error format
    assert "error" in data or "errors" in data

    # Example patterns:
    # {"error": {"code": "VALIDATION_ERROR", "message": "...", "details": [...]}}
    # {"errors": [{"field": "name", "message": "Required"}]}
```
Why this matters: if your frontend or integrations parse error codes, a “minor” change can break user flows.
6) Auth tests: verify both “locked” and “allowed” paths
Don’t only test successful authorized requests—also test the unauthorized case to ensure endpoints are protected and return predictable status codes.
```python
import os

import httpx
from dotenv import load_dotenv

load_dotenv()


def test_protected_endpoint_requires_auth():
    # os.environ[...] fails loudly with a KeyError if the variable is unset
    base_url = os.environ["API_BASE_URL"].rstrip("/")
    with httpx.Client(base_url=base_url, timeout=10.0) as c:
        r = c.get("/users/me")
        assert r.status_code in (401, 403)
```
If your API uses role-based permissions, consider testing one “forbidden” scenario too (valid token, wrong role).
7) Pagination tests: catch off-by-one and “next page” bugs
Pagination bugs are common: wrong total, repeating items, incorrect cursor/offset handling. A simple test can protect your list endpoints.
Example for offset pagination: GET /users?limit=5&offset=0.
```python
def test_users_pagination_offset(client):
    r1 = client.get("/users", params={"limit": 5, "offset": 0})
    assert r1.status_code == 200
    page1 = r1.json()

    r2 = client.get("/users", params={"limit": 5, "offset": 5})
    assert r2.status_code == 200
    page2 = r2.json()

    # Adjust to your response format: might be {"items": [...], "total": 123}
    items1 = page1.get("items", page1)
    items2 = page2.get("items", page2)

    assert len(items1) <= 5
    assert len(items2) <= 5

    # Ensure no overlap (basic sanity check)
    ids1 = {str(u["id"]) for u in items1}
    ids2 = {str(u["id"]) for u in items2}
    assert ids1.isdisjoint(ids2)
```
If you use cursor pagination, test that nextCursor actually advances and that items don’t repeat.
8) Idempotency tests: prevent accidental double-creates
Payment, order, and “create” endpoints often need idempotency keys. If your API supports it, test it. The goal: sending the same request twice with the same key should not create duplicates.
```python
def test_create_order_idempotency(client):
    headers = {"Idempotency-Key": "test-key-123"}
    payload = {"productId": "abc", "quantity": 1}

    r1 = client.post("/orders", json=payload, headers=headers)
    assert r1.status_code in (200, 201)

    r2 = client.post("/orders", json=payload, headers=headers)
    # Many APIs return 200 with the original resource, or 201 with same ID
    assert r2.status_code in (200, 201)

    order1 = r1.json()
    order2 = r2.json()
    assert str(order1["id"]) == str(order2["id"])
```
This test is a lifesaver when retries happen in the real world (mobile networks, queue replays, client timeouts).
9) Practical tips to keep tests stable
- Use known test data. Keep a stable “fixture user” in staging, or create-and-cleanup resources in the test itself.
- Prefer asserting invariants. Don’t assert exact timestamps or full lists unless necessary.
- Separate smoke vs. deep tests. A small smoke suite can run often; deeper suites can run less frequently.
- Make failures readable. When asserting complex responses, include r.text in assertion messages if helpful.
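For the create-and-cleanup approach, a yield fixture keeps teardown next to setup. A minimal sketch, assuming a `/users` create endpoint like the one above (the payload fields are placeholders):

```python
import uuid

import pytest


@pytest.fixture
def temp_user(client):
    # Create a throwaway user with a unique email for this test run
    payload = {"email": f"test-{uuid.uuid4().hex[:8]}@example.com", "name": "Temp"}
    r = client.post("/users", json=payload)
    assert r.status_code in (200, 201)
    user = r.json()

    yield user  # the test runs here

    # Teardown runs even if the test failed
    client.delete(f"/users/{user['id']}")
```

A test then just takes `temp_user` as an argument and gets a fresh user plus automatic cleanup, so no stale data accumulates in staging.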
10) Running the tests
```bash
pytest
```
If you want to point tests at another environment (like staging), update .env or export variables:
```bash
export API_BASE_URL="https://staging.api.example.com"
export API_TOKEN="..."
pytest
```
Wrap-up
A practical API test suite doesn’t need to be huge. Start with 10–20 tests that cover:
- core happy paths
- schema/contract checks for key endpoints
- validation + error format
- auth protection
- pagination and idempotency where relevant
Once these are in place, you’ll catch breaking changes quickly and confidently evolve your API without constantly fearing regressions.