Cloudflare CDN in Practice: Cache Smarter, Speed Up APIs, and Add Edge Logic with Workers

A CDN isn’t just “put Cloudflare in front of your site and hope it’s faster.” The real wins come from deliberately caching the right things, setting correct headers, and (optionally) moving tiny pieces of logic to the edge. This guide is hands-on: you’ll configure cache behavior, verify it’s working, and add a Cloudflare Worker to cache an API response safely.

We’ll focus on three practical outcomes:

  • Cache static assets correctly (and aggressively) without breaking releases
  • Cache HTML/pages carefully (or not at all) depending on your app
  • Use a Worker to cache a slow API endpoint with a safe key + TTL

1) Quick mental model: what Cloudflare caches (and when)

Cloudflare sits between your users and your origin (your server). For each request, it decides whether to serve from cache or forward to origin. The decision depends on:

  • Cache-Control / Expires headers returned by your origin
  • Cloudflare “Cache Rules” and settings (e.g., “Cache Everything”)
  • The request method (usually only GET/HEAD are cached)
  • The cache key (URL + query string + sometimes headers/cookies, depending on config)

Your job is to make responses cacheable when they should be, and uncacheable when they must be.
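As a rough sketch, you can model that decision as a small function. This is simplified — real Cloudflare behavior also involves Cache Rules, default cacheable file extensions, and cookie-based bypasses — but it captures the two levers you control from the origin: the request method and the Cache-Control header.

```javascript
// Simplified sketch of the edge cache decision described above.
// Not Cloudflare's actual algorithm — just the header/method logic
// that your origin controls.
function isCacheable(method, cacheControl) {
  // Usually only GET/HEAD responses are cached
  if (method !== "GET" && method !== "HEAD") return false;
  // Explicitly uncacheable directives win
  if (/\b(no-store|private)\b/.test(cacheControl || "")) return false;
  // A positive max-age (or s-maxage) makes the response cacheable
  return /\b(s-maxage|max-age)=[1-9]/.test(cacheControl || "");
}

console.log(isCacheable("GET", "public, max-age=31536000, immutable")); // true
console.log(isCacheable("POST", "public, max-age=60")); // false
console.log(isCacheable("GET", "private, no-store")); // false
```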

2) Cache static assets aggressively (the safest, biggest win)

Static assets—hashed JS/CSS bundles, images, fonts—are perfect for long caching because their filenames change whenever their content changes. The trick is: use content hashes in filenames (most bundlers do this already) and set very long TTLs.

Recommended headers for hashed assets:

Cache-Control: public, max-age=31536000, immutable 

Here are examples for popular origins. Pick the one you use.

2.1 Nginx example

server {
    listen 80;
    server_name example.com;

    root /var/www/app/public;

    # Hashed assets: /assets/app.3f2c1a9.js
    location ~* ^/(assets|static)/.*\.(js|css|png|jpg|jpeg|gif|svg|webp|woff2?)$ {
        add_header Cache-Control "public, max-age=31536000, immutable";
        try_files $uri =404;
    }

    # Default: don't assume HTML is cacheable
    location / {
        add_header Cache-Control "no-store";
        try_files $uri /index.html;
    }
}

2.2 Express (Node.js) example

import express from "express";
import path from "path";

const app = express();

app.use(
  "/assets",
  express.static(path.join(process.cwd(), "public/assets"), {
    setHeaders(res) {
      res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
    },
  })
);

app.get("/", (req, res) => {
  res.setHeader("Cache-Control", "no-store");
  res.sendFile(path.join(process.cwd(), "public/index.html"));
});

app.listen(3000);

Why this matters: If your assets have long TTLs but filenames are not hashed, users may get stuck with old JS/CSS after a deploy. Hashed filenames + long TTLs avoid that.
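If you're configuring a bundler manually rather than relying on framework defaults, the relevant knob in webpack is the [contenthash] placeholder. A minimal sketch (assumes a standard webpack 5 setup; paths are illustrative):

```javascript
// webpack.config.js — minimal sketch, not a complete config.
// [contenthash] changes only when the file's content changes, so each
// deploy produces new filenames and stale cached copies are never served.
module.exports = {
  output: {
    filename: "assets/[name].[contenthash].js",
    chunkFilename: "assets/[name].[contenthash].js",
  },
};
```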

3) Don’t accidentally cache personalized HTML

HTML can be tricky. If your pages include user-specific content (dashboard, account info), caching at the CDN can leak data. A good default for authenticated pages is:

Cache-Control: private, no-store 

For public pages (marketing pages, docs), caching is great. A safe approach is to cache only when:

  • There are no session cookies
  • The response is the same for everyone
  • You can handle “stale” content for a short period

If your site is a static site or fully public content, you can cache HTML too—just keep the TTL moderate (e.g., 5–30 minutes) unless you have purging/versioning in place.
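One way to encode those rules is a tiny helper that picks the Cache-Control header based on whether a session cookie is present. This is a sketch — "sid" is a placeholder for whatever session cookie your app actually sets:

```javascript
// Sketch of the decision above: cache public HTML for a moderate TTL,
// never cache anything tied to a session. "sid" is a placeholder name.
function htmlCacheHeader(cookieHeader, ttlSeconds = 600) {
  const hasSession = /(^|;\s*)sid=/.test(cookieHeader || "");
  return hasSession
    ? "private, no-store"
    : `public, max-age=${ttlSeconds}`;
}

console.log(htmlCacheHeader(""));           // "public, max-age=600"
console.log(htmlCacheHeader("sid=abc123")); // "private, no-store"
```

In an Express app you would call this from middleware and pass `req.headers.cookie`; the same logic also works inside a Worker reading the request's Cookie header.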

4) Verify caching with curl (look for CF-Cache-Status)

Cloudflare adds helpful headers to responses. The most useful is CF-Cache-Status:

  • HIT: served from Cloudflare cache
  • MISS: fetched from origin and (maybe) stored
  • BYPASS: intentionally not cached (rule/header)
  • EXPIRED: cache entry existed but was stale; revalidated/refetched

Try this:

curl -I https://example.com/assets/app.3f2c1a9.js 

Run it twice. The first request is often MISS; the second should become HIT if caching is working and the asset is cacheable.
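If you'd rather script this check, here's a small Node sketch (Node 18+, which ships global fetch and Headers; the URL below is a placeholder):

```javascript
// Sketch: read CF-Cache-Status from a response (Node 18+).
// Header lookups are case-insensitive, so a lowercase name is fine.
function cfCacheStatus(headers) {
  return headers.get("cf-cache-status") || "(not behind Cloudflare?)";
}

// Hypothetical usage — request the same asset twice and compare:
async function checkTwice(url) {
  for (let i = 1; i <= 2; i++) {
    const res = await fetch(url, { method: "HEAD" });
    console.log(`request ${i}: CF-Cache-Status = ${cfCacheStatus(res.headers)}`);
  }
}
// checkTwice("https://example.com/assets/app.3f2c1a9.js");
```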

5) Cache a slow API response at the edge with a Cloudflare Worker

Let’s say you have a public API endpoint that is slow but safe to cache for 60 seconds, e.g. /api/public/stats. You can use a Worker to:

  • Only cache GET requests
  • Build a safe cache key (include query string if needed)
  • Set a short TTL and return cached responses quickly

Worker example: cache /api/public/stats for 60 seconds

export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);

    // Only target a specific public endpoint
    if (url.pathname !== "/api/public/stats") {
      return fetch(request);
    }

    // Only cache GET
    if (request.method !== "GET") {
      return fetch(request);
    }

    // Don’t cache if Authorization is present (defensive)
    if (request.headers.get("Authorization")) {
      return fetch(request);
    }

    // Build a cache key that includes query string
    const cacheKey = new Request(url.toString(), request);
    const cache = caches.default;

    // Try cache first
    let response = await cache.match(cacheKey);
    if (response) {
      // Add debug header so you can confirm it’s cached
      response = new Response(response.body, response);
      response.headers.set("X-Edge-Cache", "HIT");
      return response;
    }

    // Fetch from origin
    response = await fetch(request);

    // Only cache successful responses
    if (response.ok) {
      // Clone because response bodies are streams
      const toCache = new Response(response.body, response);
      // Force caching for 60 seconds at the edge
      toCache.headers.set("Cache-Control", "public, max-age=60");
      // Store asynchronously (don’t block the request)
      ctx.waitUntil(cache.put(cacheKey, toCache.clone()));
      const out = new Response(toCache.body, toCache);
      out.headers.set("X-Edge-Cache", "MISS");
      return out;
    }

    return response;
  },
};

How to use it: Create a Worker, deploy it, and route it to your domain (e.g., example.com/* or a narrower route like example.com/api/public/*). After deployment:

curl -i https://example.com/api/public/stats 

Run it twice. You should see X-Edge-Cache: MISS then X-Edge-Cache: HIT.

Important safety notes:

  • Never cache responses that vary per-user (cookies, auth headers, personalized output).
  • Be explicit about the endpoint(s) you cache.
  • Prefer short TTLs unless you have a reliable purge/versioning strategy.
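A related gotcha when being explicit about cached endpoints: query-string noise such as utm_* tracking parameters can fragment the edge cache into many near-identical entries. A hedged sketch of a key normalizer you could apply before building the cache key (the parameter list is illustrative — extend it for your own app):

```javascript
// Sketch: normalize a URL before using it as a cache key, so tracking
// parameters don't create separate cache entries for identical content.
function normalizeCacheKeyUrl(rawUrl) {
  const url = new URL(rawUrl);
  for (const name of [...url.searchParams.keys()]) {
    // Drop common tracking params (illustrative list — extend as needed)
    if (name.startsWith("utm_") || name === "fbclid" || name === "gclid") {
      url.searchParams.delete(name);
    }
  }
  url.searchParams.sort(); // stable ordering: ?a=1&b=2 and ?b=2&a=1 collide
  return url.toString();
}

console.log(
  normalizeCacheKeyUrl("https://example.com/api/public/stats?utm_source=x&region=eu")
);
// → "https://example.com/api/public/stats?region=eu"
```

Only do this for parameters you are certain don't change the response — stripping a parameter the origin actually uses would serve the wrong cached content.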

6) Add security and performance headers at the edge (simple wins)

You can also use Cloudflare to add headers that improve security and caching consistency. If you do it in a Worker, keep it minimal and predictable.

Example: add a few safe defaults

function withSecurityHeaders(response) {
  const headers = new Headers(response.headers);
  headers.set("X-Content-Type-Options", "nosniff");
  headers.set("Referrer-Policy", "strict-origin-when-cross-origin");
  headers.set("Permissions-Policy", "geolocation=(), microphone=(), camera=()");
  return new Response(response.body, { status: response.status, headers });
}

export default {
  async fetch(request) {
    const res = await fetch(request);
    return withSecurityHeaders(res);
  },
};

This is especially handy when you don’t control the origin easily (legacy apps) or you want consistent behavior across multiple services.

7) A practical checklist for junior/mid devs

  • Static assets: hashed filenames + Cache-Control: public, max-age=31536000, immutable
  • HTML: default to no-store unless the page is truly public and safe to cache
  • APIs: cache only public, non-auth, non-user-specific endpoints; keep TTL short
  • Verification: use curl -I and check CF-Cache-Status (and your own debug headers)
  • Purge strategy: prefer “cache forever” assets with versioned filenames over frequent purges

Wrap-up

Cloudflare is most effective when you treat caching as part of your app’s design, not a magic switch. Start with aggressive caching for hashed assets (easy and safe), keep HTML caching conservative, and use Workers for targeted edge caching of public endpoints. With a couple of headers and a small Worker, you can cut load times and origin traffic dramatically—without risky “cache everything” surprises.

