
Workers Caching Patterns: Cache API vs Cache Rules (When You Need Code, When You Don't)


Cloudflare offers two approaches to caching at the edge: declarative Cache Rules configured through the dashboard, and programmatic caching through the Workers Cache API. Both control what gets cached, for how long, and under what conditions—but they suit different scenarios. Choosing the wrong approach wastes either engineering time or configuration flexibility.

Cache Rules: the declarative approach

Cache Rules let you define caching behaviour through Cloudflare’s dashboard or API without writing code. You set conditions (URL patterns, headers, cookies) and actions (cache, bypass, set TTL). Cloudflare evaluates rules in order against each request and applies matching actions.
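
For concreteness, a rule managed through the Rulesets API pairs an expression with a cache action. The sketch below follows Cloudflare’s documented schema for the `set_cache_settings` action, but verify field names against the current API reference before relying on them:

```json
{
  "expression": "starts_with(http.request.uri.path, \"/static/\")",
  "action": "set_cache_settings",
  "action_parameters": {
    "cache": true,
    "edge_ttl": { "mode": "override_origin", "default": 31536000 }
  }
}
```

The same rule can be built entirely in the dashboard without touching JSON; the API form is shown only to make the condition-plus-action structure explicit.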

This works well for straightforward caching needs. Cache all static assets aggressively. Cache HTML with moderate TTLs. Bypass cache for authenticated users. Set specific TTLs for specific paths. These common patterns map directly to rule conditions and actions.

The advantages are significant: no deployment pipeline, no code to maintain, instant changes through the dashboard, and no execution cost. Cache Rules evaluate at the CDN level without invoking Workers, so they’re faster and cheaper per request. For most WordPress sites, Cache Rules handle caching requirements completely.

The limitation is expressiveness. Cache Rules evaluate static conditions. They can’t inspect response bodies, make conditional decisions based on computed values, or implement complex logic. If your caching needs fit “if condition then action” patterns, rules suffice. If you need “if condition then compute something then decide,” you need Workers.

Workers Cache API: the programmatic approach

The Workers Cache API gives you JavaScript code that runs on every request (or targeted requests) at the edge. You can inspect requests, modify cache keys, transform responses, set per-response cache parameters, and implement arbitrary logic.

This enables caching patterns impossible with rules alone. Cache responses based on content rather than just URL. Implement stale-while-revalidate patterns. Construct complex cache keys from multiple request attributes. Strip sensitive headers before caching. Serve partially cached responses with dynamic components.
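
The core pattern the Cache API enables can be sketched as follows. This is an illustrative version with the runtime pieces injected as parameters so the logic reads on its own; in a deployed Worker, `cache` would be `caches.default`, `fetchOrigin` the global `fetch`, and `waitUntil` would come from the execution context:

```javascript
// Minimal "check cache, fall back to origin, populate cache" sketch.
// In a real Worker: cache = caches.default, fetchOrigin = fetch,
// waitUntil = ctx.waitUntil. Injected here so the flow is self-contained.
async function cachedFetch(request, cache, fetchOrigin, waitUntil) {
  // Serve from the edge cache when possible.
  const hit = await cache.match(request);
  if (hit) return hit;

  // Miss: go to origin, store a copy without delaying the response.
  const response = await fetchOrigin(request);
  waitUntil(cache.put(request, response.clone()));
  return response;
}
```

Note that the cache write happens via `waitUntil`, so the visitor’s response is never blocked on populating the cache.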

The cost is meaningful: Workers execute code on every request that matches their route. This adds latency (typically 1-5ms) and cost (per-request pricing). The code needs maintenance, testing, and monitoring. Bugs in Workers caching logic can serve wrong content or prevent caching entirely, and debugging them means debugging application code rather than reviewing a rule.

When Cache Rules are the right choice

Standard website caching—static assets, HTML pages, API responses with predictable cache behaviour—fits Cache Rules perfectly. The vast majority of sites, including complex WordPress installations, need nothing beyond what rules provide.

Performance-sensitive paths where you can’t afford the extra milliseconds of Worker execution should use rules. For extremely high-traffic pages, the per-request latency of Workers multiplied by millions of requests is meaningful. Rules evaluate without this overhead.

Operational simplicity favours rules. Non-technical team members can adjust caching behaviour through the dashboard. No code review, no deployments, no testing pipeline. Changes take effect within seconds. This operational agility matters for teams without dedicated infrastructure engineers.

Cache Rules should be your default choice. Start with rules and only add Workers when you encounter a caching requirement rules can’t express.

When Workers Cache API is necessary

Custom cache key construction beyond what rules support requires Workers. If you need to build cache keys from request body content, computed header values, or complex cookie parsing, Workers provide the necessary flexibility.
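
As a sketch of what that flexibility looks like, the function below builds a cache key by sorting query parameters and folding a variant cookie plus a device-class header into the URL. The cookie name (`ab_variant`) and header (`CF-Device-Type`) are illustrative choices, not requirements:

```javascript
// Hypothetical cache-key builder for a Worker. Returns a URL string,
// which cache.match / cache.put accept as a key.
function buildCacheKey(request) {
  const url = new URL(request.url);
  url.searchParams.sort(); // stable ordering: ?b=2&a=1 and ?a=1&b=2 share a key

  // Parse only the cookie we care about; ignore the rest.
  const cookies = request.headers.get("Cookie") || "";
  const variant = (cookies.match(/(?:^|;\s*)ab_variant=([^;]+)/) || [])[1] || "none";
  const device = request.headers.get("CF-Device-Type") || "desktop";

  // Fold the computed values into the key so variants cache separately.
  url.searchParams.set("__variant", `${variant}-${device}`);
  return url.toString();
}
```

Because the key is derived rather than taken verbatim from the request URL, equivalent requests collapse onto one cache entry while genuinely different variants stay separate.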

Response transformation before caching—stripping tracking parameters, normalising URLs, injecting headers, or modifying content—requires code. Workers can intercept responses from origin, modify them, and cache the modified version. Rules can’t alter response content.
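
A minimal sketch of that transformation step, which in a Worker would run between the origin fetch and `cache.put` (the `max-age` value here is illustrative):

```javascript
// Normalise a response before caching it: drop Set-Cookie so a
// per-visitor cookie is never cached, and force a cacheable directive.
function sanitiseForCache(response) {
  const headers = new Headers(response.headers);
  headers.delete("Set-Cookie");
  headers.set("Cache-Control", "public, max-age=3600");

  // Responses are immutable once constructed, so build a new one
  // around the original body with the cleaned headers.
  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers,
  });
}
```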

Stale-while-revalidate patterns, where you serve cached content while simultaneously fetching fresh content from origin, are elegant to implement in Workers. The visitor gets an instant cached response while the cache updates in the background. This balances freshness and performance in ways rules can’t express.
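
One way to sketch the staleness decision: the Worker records when a response was cached in a custom header (`x-cached-at`, a convention invented for this example) and treats anything older than a threshold as stale but still servable:

```javascript
// Decide whether a cached response is past its freshness window.
// `x-cached-at` is a hypothetical header the Worker sets at cache.put time.
function isStale(response, maxAgeMs, now = Date.now()) {
  const cachedAt = Number(response.headers.get("x-cached-at") || 0);
  return now - cachedAt > maxAgeMs;
}

// In the Worker handler (runtime-only sketch, not executed here):
//   const hit = await cache.match(request);
//   if (hit) {
//     if (isStale(hit, 60_000)) ctx.waitUntil(refreshFromOrigin(request, cache));
//     return hit; // serve instantly, fresh or stale
//   }
```

The key point is that the stale hit is returned immediately; only the background refresh (via `waitUntil`) goes to origin.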

A/B testing at the edge, where different visitors see different cached variants, needs Workers to implement the assignment and routing. Rules can’t express random assignment or user-based routing.
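
A sketch of deterministic assignment: hash a stable visitor identifier (a hypothetical `uid` cookie value, say) into a bucket, so the same visitor always lands on the same cached variant:

```javascript
// Deterministic A/B bucket assignment from a stable visitor id.
// The hash is a simple 31-multiplier rolling hash, fine for bucketing.
function assignBucket(visitorId, buckets = ["control", "test"]) {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // keep it in 32 bits
  }
  return buckets[hash % buckets.length];
}

// The Worker would then fold the bucket into the cache key, e.g.
// `${url}?__bucket=${assignBucket(uid)}`, so each variant caches separately.
```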

Combining both approaches

The most effective configurations use rules for common patterns and Workers for specific exceptions. Cache Rules handle the majority of requests efficiently. Workers handle the edge cases that need custom logic.

For example: Cache Rules set default caching for all static assets and HTML pages. A Worker intercepts API requests that need custom cache key construction. Another Worker handles a specific dynamic page that requires stale-while-revalidate. Everything else flows through rules.
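
One way to scope the Workers in that setup, sketched with Wrangler’s route syntax and a hypothetical example.com zone:

```toml
# Hypothetical wrangler.toml: this Worker only runs on the API paths that
# need custom cache keys; every other request never invokes it and is
# handled by Cache Rules alone.
name = "api-cache-worker"
main = "src/index.js"
compatibility_date = "2024-01-01"

routes = [
  { pattern = "example.com/api/*", zone_name = "example.com" }
]
```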

This layered approach minimises Workers execution (and cost) while maintaining full flexibility where needed. Most requests never invoke a Worker; only those requiring custom logic pay the execution cost.

Implementation considerations

Cache Rules have quantity limits per plan. Free plans allow limited rules; paid plans allow more. Plan your rule structure to stay within limits. Consolidate rules where possible—one rule with multiple conditions rather than multiple rules with single conditions.
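
For example, three separate path-based rules can often collapse into a single expression using the Rules language’s boolean operators (field and function names here follow Cloudflare’s Rules language as documented; confirm against the current reference):

```
starts_with(http.request.uri.path, "/assets/")
  or starts_with(http.request.uri.path, "/images/")
  or http.request.uri.path.extension in {"css" "js" "woff2"}
```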

Workers have execution time and memory limits. Complex caching logic must complete within time constraints. For most caching use cases, this is easily achievable. But if your Worker makes sub-requests, processes large responses, or implements complex algorithms, be aware of limits.

Testing Workers caching logic requires Cloudflare’s local development tools (Wrangler) or a deployed staging environment. Unlike rules, which you can verify immediately through the dashboard, Workers require a full development workflow. Invest in testing before deploying caching Workers to production.

Monitor both layers. Track cache hit rates from rules and Workers separately. If Workers cache logic isn’t achieving expected hit rates, debug the logic. If rules aren’t matching expected requests, review rule conditions. Different monitoring approaches apply to each layer.

The practical decision framework

Ask these questions in order: Can Cache Rules express your caching requirement? If yes, use rules. If no, do you need this caching pattern badly enough to justify Workers complexity and cost? If yes, implement in Workers. If no, reconsider whether the caching pattern is necessary.

Most sites never need Workers for caching. Standard WordPress sites, content sites, marketing sites, and even moderately complex web applications handle caching entirely through rules. Workers caching makes sense for applications with genuinely custom caching requirements—and those are rarer than you might think.

Start simple. Rules first. Workers only when rules provably can’t solve your specific problem. This approach minimises cost, complexity, and maintenance while delivering effective caching for your site. If you need help evaluating whether your caching needs warrant Workers or can be handled by rules alone, our performance team can audit your current setup and recommend the right approach.
