Most web applications rebuild HTTP, caching, state management, and content negotiation inside a JavaScript framework. Here's an alternative: use the web platform itself. HTML over the wire, compute at the edge, zero framework lock-in.
The default architecture for modern web apps is JSON APIs consumed by a JavaScript framework. It works — but it reinvents, in application code, what the web platform already provides natively.
Hypermedia-driven architecture delivers the same capabilities — typed responses, declarative auth, server-side composition, event-driven invalidation, cursor-based pagination — by making HTML the primary wire format and using HTTP as it was designed.
Instead of building a codegen pipeline, a typed client library, custom framework hooks, and an API seam to mediate between them, you ship HTML fragments from the edge and let the browser's native hypermedia engine do the rest.
When a team chooses a JSON-first architecture with a heavy client framework, here's what they're actually building and maintaining — often without realising the accumulated cost.
TanStack Query, Redux, Zustand — all exist because JSON doesn't tell the browser how to render itself. The client has to parse, interpret, cache, invalidate, and re-fetch. That's a state machine you maintain forever.
OpenAPI → generated types → generated hooks → versioned packages. Works great until the schema changes, the codegen breaks, or the generated code doesn't match what the backend actually does.
React, Vue, Svelte — they exist to solve a problem that HTML already solves: turning data into pixels. When the server sends HTML instead of JSON, the browser's built-in rendering engine does this for free.
One screen = multiple API calls → client-side composition → loading states → error boundaries → stale-while-revalidate. The server could compose all of this into a single HTML response.
Client caches go stale. Invalidations need to propagate. Realtime updates need to trigger re-fetches. HTMX + SSE handles this declaratively — the server pushes an event, the affected fragment re-fetches.
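The invalidation signal above is just a named SSE event. A minimal serializer (a hypothetical helper, not part of HTMX or any library) shows how little protocol there is to it:

```typescript
// Hypothetical helper: serialize a named SSE event. A fragment listening
// for this event name re-fetches itself when the event arrives.
export function sseEvent(name: string, data: string, id?: number): string {
  const lines = [`event: ${name}`];
  if (id !== undefined) lines.push(`id: ${id}`); // enables Last-Event-ID replay
  // Per the SSE wire format, each data line gets its own `data:` field
  for (const line of data.split("\n")) lines.push(`data: ${line}`);
  return lines.join("\n") + "\n\n"; // blank line terminates the event
}
```

The server writes this string to the open SSE response; no client-side cache bookkeeping is involved.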
React 16 → 17 → 18 → 19. Next.js Pages → App Router → RSC. Each migration is weeks of engineering time. HTMX hasn't had a breaking change in years. HTML is forever.
The server is not a JSON vending machine. It's a hypermedia engine that responds to HTTP requests with HTML fragments the browser can render directly — or JSON when a non-HTML consumer (mobile app, CLI, third-party) needs it.
Each piece is chosen because it's the right tool for a hypermedia architecture on Cloudflare's edge platform. No polyfills, no workarounds, no "we'll migrate later."
The edge-native HTTP framework. Type-safe routing, middleware chains, JSX rendering — all on Cloudflare Workers. Sub-millisecond cold start. Replaces Express, Next.js Route Handlers, and tRPC in a single, purpose-built package.
14kb of JavaScript that extends HTML with hypermedia controls. hx-get, hx-post, hx-swap — declarative attributes for server-driven UI updates. The browser becomes a fully-capable hypermedia client without a heavy client-side framework.
Realtime reactive UI without a JavaScript framework. SSE-native — the server pushes HTML fragment updates and the browser applies them. Replaces WebSocket subscriptions and client-side state sync with a single persistent SSE connection.
Cloudflare's SQLite database with Drizzle as the type-safe ORM. Schema-first, migration-native, zero-latency from Workers. Cursor-based pagination is a native query pattern, not an abstraction layer.
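Keyset pagination is simple enough to sketch without the ORM. This illustrative version (table shape and names are assumptions, not the real schema) stands in for a `WHERE id > ? ORDER BY id LIMIT ?` query against D1:

```typescript
// Keyset pagination sketch. The cursor is the last id the client saw,
// so each page is an indexed range scan rather than an OFFSET walk.
type Row = { id: number; name: string };

export function pageAfter(
  rows: Row[],           // stands in for the D1 table
  cursor: number | null, // last id the client saw; null for the first page
  limit: number
): { items: Row[]; nextCursor: number | null } {
  const sorted = [...rows].sort((a, b) => a.id - b.id);
  const items = sorted
    .filter((r) => cursor === null || r.id > cursor)
    .slice(0, limit);
  // A short page means there are no more rows: signal the end with null
  const nextCursor = items.length === limit ? items[items.length - 1].id : null;
  return { items, nextCursor };
}
```

In the hypermedia flow, `nextCursor` becomes the `hx-get` URL on a "load more" element in the rendered fragment.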
Single-instance stateful compute at the edge. Perfect for: session management, presence tracking, long-running operations, rate limiting, and real-time coordination. Each DO is a consistent state machine with sub-millisecond access.
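Rate limiting is a good illustration of why single-instance state matters. A token bucket like the sketch below (names and numbers are illustrative, not Cloudflare APIs) needs no locks inside a Durable Object, because each DO processes requests one at a time:

```typescript
// Token-bucket rate limiter as it might live inside a Durable Object.
// Single-threaded DO execution means this mutable state is race-free.
export class RateLimiter {
  private tokens: number;
  private last = 0;

  constructor(
    private capacity = 10,          // burst size (assumed value)
    private refillPerMs = 10 / 1000 // steady state: 10 requests/second
  ) {
    this.tokens = capacity;
  }

  allow(nowMs: number): boolean {
    const elapsed = Math.max(0, nowMs - this.last);
    // Refill proportionally to elapsed time, capped at capacity
    this.tokens = Math.min(this.capacity, this.tokens + elapsed * this.refillPerMs);
    this.last = nowMs;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```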
Production-grade TypeScript library for building type-safe, composable, observable services. Tagged error unions, typed service layers (dependency injection), structured concurrency, and observability built into every pipeline.
Every web application boils down to three things: reading data, writing data, and receiving live updates. In a hypermedia architecture, these map directly to HTTP verbs and SSE.
GET returning HTML

A query is an HTTP GET that returns an HTML fragment. The browser caches it via standard HTTP cache headers. HTMX issues the request and swaps the result into the DOM.
```html
<!-- Dashboard panel — one GET, one HTML response -->
<div
  hx-get="/api/dashboard/metrics"
  hx-trigger="load, metrics.updated from /sse/dashboard"
  hx-swap="innerHTML"
>
  Loading metrics...
</div>
```
The server responds with a complete HTML fragment — the composed view of whatever data that panel needs. No client-side composition. The hx-trigger with from causes automatic refresh when an SSE event fires.
POST with idempotency

A command is an HTTP POST (or PUT for idempotent updates). The server processes it and returns an HTML fragment showing the updated state. Idempotency keys travel as HTTP headers.
```html
<!-- Update settings — form POST, returns updated panel -->
<form
  hx-post="/api/settings"
  hx-headers='{"idempotency-key": "{{uuid}}"}'
  hx-target="#settings-panel"
  hx-swap="innerHTML"
>
  <input name="key" type="text" />
  <textarea name="value"></textarea>
  <button>Save</button>
</form>
```
The server returns the updated panel as HTML. The invalidation model is implicit — the response replaces the stale UI. Other fragments listening on the same SSE channel auto-refresh.
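Server-side, the idempotency key boils down to a small dedupe step. This sketch (the names are illustrative; a production version would persist keys with a TTL rather than a process-local Map) shows the contract:

```typescript
// Idempotency sketch: the first POST with a given key executes the command
// and stores the rendered fragment; a retry with the same key replays the
// stored response instead of re-executing the command.
const seen = new Map<string, string>();

export function handleOnce(key: string, run: () => string): string {
  const cached = seen.get(key);
  if (cached !== undefined) return cached; // duplicate delivery: replay
  const html = run();
  seen.set(key, html);
  return html;
}
```

A retried form submission therefore renders the same updated panel without creating a second record.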
Realtime updates flow over Server-Sent Events. Datastar manages the connection, reconnect, and DOM patching. No WebSocket libraries, no client-side state sync.
```html
<!-- Live feed — SSE pushes fragment updates -->
<div
  data-on-load="$$sse('/api/feed/live')"
  sse-swap="feed-update"
>
  <div id="items">
    <!-- Server pushes updated items as HTML -->
  </div>
</div>
```
The server pushes HTML fragments over SSE. Cursors and replay are handled server-side via Durable Objects — the DO maintains the event log and replays from cursor on reconnect.
Most teams build a subscription system from scratch — channels, cursors, replay, backpressure, auth re-check on reconnect. Cloudflare's platform gives you most of this for free.
Each live entity (user session, notification feed, collaborative document) is backed by a Durable Object. The DO owns the channel's event log and its cursor state.
Each client holds one SSE connection to its DO-backed endpoint, so ordering, replay, and fan-out are handled in one place.
SSE's native Last-Event-ID header gives you cursor replay for free. The Durable Object maintains an in-memory event log (configurable TTL per channel). On reconnect, the browser sends its last event ID; the DO replays missed events. Events beyond the TTL trigger a snapshot handshake — a "query first, then subscribe" pattern without any client code.
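The replay model described above reduces to a small amount of state. This in-memory sketch (names are assumptions; the real version lives inside a Durable Object and evicts by TTL) shows the cursor semantics, including the "cursor aged out, fall back to snapshot" case:

```typescript
// Sketch of the Durable Object's replay model: events get monotonically
// increasing ids; on reconnect, the client's Last-Event-ID selects the
// suffix to replay. An evicted cursor forces a snapshot handshake.
type LogEvent = { id: number; html: string };

export class EventLog {
  private events: LogEvent[] = [];
  private nextId = 1;

  append(html: string): number {
    const id = this.nextId++;
    this.events.push({ id, html });
    return id;
  }

  // Returns events after `lastEventId`, or null if that cursor has been
  // evicted (replay impossible: client must re-query a snapshot first).
  replayAfter(lastEventId: number): LogEvent[] | null {
    if (this.events.length > 0 && lastEventId < this.events[0].id - 1) {
      return null;
    }
    return this.events.filter((e) => e.id > lastEventId);
  }

  // Stands in for TTL-based eviction: drop everything older than `id`
  evictBefore(id: number): void {
    this.events = this.events.filter((e) => e.id >= id);
  }
}
```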
| Channel | DO Namespace | Cursor TTL | Delivery |
|---|---|---|---|
| session.live | SessionDO | 2 hours | SSE → HTML fragments |
| notifications | NotificationDO | 24 hours | SSE → toast HTML |
| entity.status | EntityDO | 1 hour | SSE → HTMX trigger |
| collab.edit | RoomDO | 6 hours | SSE → HTMX trigger |
Effect TS is the backbone of the server-side logic. It doesn't replace Hono or Drizzle — it composes them. Every handler, every database query, every side effect runs inside an Effect pipeline that provides typed errors, structured concurrency, dependency injection, and observability for free.
Tagged error classes with exhaustive pattern matching. The compiler catches missing cases.
```typescript
class NotFoundError {
  readonly _tag = "NotFoundError";
  constructor(readonly entity: string, readonly id: string) {}
}

class RateLimitedError {
  readonly _tag = "RateLimitedError";
  constructor(readonly retryAfter: number) {}
}

// Exhaustive matching — compiler catches missing cases
const renderError = Effect.catchTags({
  NotFoundError: (e) => renderNotFound(e.entity, e.id),
  RateLimitedError: (e) => renderRateLimited(e.retryAfter),
  ValidationError: (e) => renderValidation(e.fields),
  // missing a tag? Compile error.
});
```
Each domain service is an Effect Layer. Handlers declare dependencies; the runtime wires them. Test mode swaps in fixture layers without changing a single handler.
```typescript
// Service declaration
class UserService extends Effect.Service<"UserService">()({
  getProfile: Effect.fn((id: string) =>
    Effect.gen(function* () {
      const db = yield* Database;
      const cache = yield* CacheService;
      const user = yield* cache.getOrFetch(`user:${id}`, () =>
        db.query.users.findById(id)
      );
      if (!user) yield* Effect.fail(new NotFoundError("user", id));
      return user;
    })
  ),
}) {}

// Handler uses it — no imports, no wiring
const handler = Effect.gen(function* () {
  const svc = yield* UserService;
  const profile = yield* svc.getProfile(id);
  return renderProfile(profile);
});
```
Keep retry on the server — where it belongs. Typed retry schedules per error class. Transient DB failures get retried; validation errors don't.
```typescript
// Retry transient DB failures, not validation errors
const safeQuery = db.query.users.findById(id).pipe(
  Effect.retry(
    Schedule.exponential("100 millis")
      .pipe(Schedule.jittered)
      .pipe(Schedule.whileOutput((_) => _ < "5 seconds"))
  ),
  Effect.catchTag("DatabaseError", () =>
    Effect.fail(new DegradedError("user_data"))
  )
);
```
Per-operation metrics (latency, error code, cache hit) come free when you add Effect.withSpan to any pipeline. Zero manual logging required.
```typescript
// Every handler auto-instruments
app.get(
  "/api/dashboard/metrics",
  authenticate(),
  authorize("admin", "viewer"),
  async (c) => {
    const program = Effect.gen(function* () {
      const svc = yield* MetricsService;
      const data = yield* svc.getDashboard(c.req.param("id"));
      return c.html(<MetricsPanel {...data} />);
    }).pipe(
      Effect.withSpan("metrics.getDashboard", {
        attributes: { operation: "metrics.getDashboard" },
      }),
      renderError
    );
    return runHandler(program);
  }
);
```
UserService.Live uses D1; UserService.Test uses an in-memory store with fixture injection. Same handlers, same routes, same type signatures — different backing services. Time control is a Clock layer swap. Error simulation is a service wrapper. Zero new routes needed.
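Stripped of the Effect machinery, the layer swap is ordinary dependency inversion. A plain-TypeScript sketch (the real version uses Effect Layers; these names are illustrative) makes the testing claim concrete:

```typescript
// Layer-swap sketch: the handler depends only on the interface, so tests
// inject an in-memory store with fixtures and reuse the same rendering
// logic. No new routes, no handler changes.
interface UserStore {
  findById(id: string): string | undefined;
}

// The "Live" layer would wrap D1 (elided here); "Test" uses fixtures.
export function makeTestStore(fixtures: Record<string, string>): UserStore {
  return { findById: (id) => fixtures[id] };
}

// Same handler for both layers
export function renderProfile(store: UserStore, id: string): string {
  const name = store.findById(id);
  return name ? `<h1>${name}</h1>` : `<p>No user found</p>`;
}
```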
In a JSON architecture, errors are typed objects the client maps to UI components. In a hypermedia architecture, the server renders the response directly — so errors become HTML fragments that mean something to the user.
Traditional validation middleware rejects invalid input with a 400 before the handler runs. The client then has to parse the error, map it to UI state, and re-render. Soft-validation takes a different approach: validation issues are collected as typed context and passed into the handler. The handler renders them inline.
```typescript
// Validation middleware — collects, doesn't reject
app.post("/api/items", softValidate(CreateItemSchema), async (c) => {
  const { issues, data } = c.get("validation");
  if (issues.length > 0) {
    // Return HTML with inline errors — same form, error messages rendered
    return c.html(<ItemForm data={data} errors={issues} />, 422);
  }
  // Happy path — create the item, then return the updated list
  await createItem(data);
  const items = await listItems();
  return c.html(<ItemList items={items} />);
});
```
The user sees the same form with error messages highlighted next to the relevant fields. No client-side error parsing, no state management for "form has errors" — the server just re-renders the form with the issues visible.
```typescript
// Effect error → HTML fragment mapping
const renderError = Effect.catchTags({
  NotFoundError: (e) =>
    html(
      <EmptyState
        title={`No ${e.entity} found`}
        hint="This may have been removed or the link is incorrect."
      />
    ),
  ValidationError: (e) => html(<FormErrors fields={e.fields} />),
  RateLimitedError: (e) => html(<RateLimitBanner retryAfter={e.retryAfter} />),
  DegradedError: (e) => html(<PartialDataWarning source={e.source} />),
  ForbiddenError: (e) => html(<NoPermission required={e.requiredRole} />),
});
```
Each error becomes an HTML fragment that replaces the relevant part of the page via HTMX swap. The user sees a helpful message, not a broken page or a toast with a cryptic code.
How each architecture addresses the same operational requirements.
| Requirement | JSON-first (Default) | Hypermedia-first |
|---|---|---|
| Wire format | JSON over RPC | HTML fragments + SSE |
| Client state | TanStack Query / Redux cache | DOM is state — no cache layer |
| Invalidation | Event signals → client re-fetch | SSE event → HTMX auto-refetch |
| Composition | One composite endpoint per screen | One GET returning composed HTML |
| Error handling | Typed JSON errors → client switch | Effect tagged errors → HTML error fragments |
| Retry & resilience | Retry metadata → client retry | Effect retry schedules — server-side |
| Auth & sessions | Cookie → context → policy gates | Cookie → Hono middleware → policy gates |
| Pagination | Cursor in JSON response | Cursor in hx-next header / next-page link |
| Realtime | WebSocket + cursor protocol | SSE + Durable Objects + Last-Event-ID |
| Caching | Client cache + edge CDN | HTTP Cache-Control + edge CDN |
| Codegen | Required — schema → client types | None — types live in Hono routes + Drizzle |
| Client JS bundle | React + TanStack + hooks (~180kb) | HTMX + Datastar + Alpine (~30kb) |
| Framework coupling | Next.js / React / Vue — locked in | None — any HTML server works |
| Testing | Fixtures in separate package | Layer swap — same routes, test backing |
| Mobile support | Separate API or shared contract | Content negotiation — same handler, JSON or HTML |
The same feature — a dashboard panel with live metrics and settings. How it's built in each architecture.
```typescript
// 1. Schema (packages/api-contracts/operations/dashboard.ts)
export const dashboardGetMetrics = {
  input: z.object({ dashboardId: z.string() }),
  output: z.object({ metrics: MetricsSummary, settings: Settings, ... }),
  policy: { roles: ['admin', 'viewer'] },
  cache: 'private',
};

// 2. Codegen'd hook (auto-generated)
export function useDashboardGetMetrics(input) { ... }

// 3. Feature component
function MetricsPanel() {
  const { data, error } = useDashboardGetMetrics({ dashboardId });
  if (error) return <ErrorBoundary error={error} />;
  return <Panel metrics={data.metrics} settings={data.settings} ... />;
}

// 4. API endpoint (seam)
export async function POST(req) {
  const actor = await resolveActor(req);
  const input = schema.parse(await req.json());
  enforcePolicy(actor, schema.policy);
  return backend.dashboard.getMetrics(input, actor);
}
```
```html
<!-- 1. HTML (the "component") -->
<div
  hx-get="/api/dashboard/metrics"
  hx-trigger="load, metrics.updated from /sse/dashboard"
  hx-swap="innerHTML"
>
  Loading metrics...
</div>
```

```typescript
// 2. Hono route handler (the "backend")
app.get(
  "/api/dashboard/metrics",
  authenticate(),
  authorize("admin", "viewer"),
  async (c) => {
    const program = Effect.gen(function* () {
      const svc = yield* MetricsService;
      const data = yield* svc.getDashboard(c.req.param("id"));
      return c.html(<MetricsPanel {...data} />);
    }).pipe(Effect.withSpan("dashboard.getMetrics"), renderError);
    return runHandler(program);
  }
);
```
This architecture isn't assembled from blog posts. Every piece of it comes from shipping real systems and learning where the default approach breaks down.
Built the WebSocket support for HTMX 4. Understanding hypermedia at the protocol level — not just how to use HTMX, but why it works and where the edges are — shaped the thinking behind this approach.
25+ years building low-latency audio/video pipelines and streaming architectures. The SSE + Durable Objects approach comes from production experience with sub-second delivery requirements — not theoretical latency budgets.
Workers, Durable Objects, D1, R2, KV — shipping production systems on Cloudflare's platform daily. This architecture uses Cloudflare's actual capabilities, not aspirational ones from the docs.
Voice agent platforms, sub-350ms ASR pipelines, realtime AI video training — the AI workload requirements that most architecture proposals ignore are baked into this approach from the start.
No architecture is perfect. Here are the genuine considerations.
Teams that are React-native will need to learn a different mental model. HTMX is far simpler than React + TanStack Query, the learning curve is short, and the productivity gains are immediate once internalised — but it IS a shift.
Complex UIs (drag-and-drop, rich text editing, interactive canvases) need Alpine.js or light JS supplements. This is normal in hypermedia architectures — "JS as a supplement, not the foundation."
JSON-first gives you compile-time type safety from schema to client hook. Hypermedia trades this for HTTP-level contracts: the route is typed (Hono), the query is typed (Drizzle), but the HTML wire format is not. Weaker type safety at the boundary, but zero codegen complexity.
The same Hono routes return JSON when Accept: application/json is sent. Content negotiation is built in — the same handler produces HTML or JSON based on the Accept header. No separate API layer needed.
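Stripped down to its decision logic, the negotiation is a branch on the Accept header. This plain-TypeScript sketch (the real version sits inside a Hono handler; a production version would parse q-values rather than substring-match) shows one handler producing both representations:

```typescript
// Content-negotiation sketch: one handler, two representations of the
// same data, selected by the request's Accept header.
type Metrics = { count: number };

export function negotiate(accept: string, data: Metrics): string {
  if (accept.includes("application/json")) {
    return JSON.stringify(data); // non-HTML consumers: mobile app, CLI
  }
  return `<div class="metrics">${data.count}</div>`; // browser default
}
```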
You don't have to go all-in. HTMX works alongside existing JavaScript. You can adopt this architecture route by route — start with one panel, one form, one feature. The server-driven model coexists with client-side rendering during migration.
Don't take my word for it. Pick one screen — something with realtime updates, auth, and error handling. I'll build it both ways: the default Next.js approach and the hypermedia approach. Same requirements, same design, side by side.
The bet: the hypermedia version ships faster, has fewer lines of code, zero framework lock-in, and a smaller bundle. If it doesn't, this whole document is wrong — and you'll know for certain.