Architecture Proposal

Build on the web.
Not around it.

Most web applications rebuild HTTP, caching, state management, and content negotiation inside a JavaScript framework. Here's an alternative: use the web platform itself. HTML over the wire, compute at the edge, zero framework lock-in.

Stu Kennedy · May 2026 · Cloudflare Edge · HTMX · Datastar · Effect TS

The web already has
an application protocol.
We should use it.

The default architecture for modern web apps is JSON APIs consumed by a JavaScript framework. It works — but it reinvents, in application code, what the web platform already provides natively.

Hypermedia-driven architecture delivers the same capabilities — typed responses, declarative auth, server-side composition, event-driven invalidation, cursor-based pagination — by making HTML the primary wire format and using HTTP as it was designed.

Instead of building a codegen pipeline, a typed client library, custom framework hooks, and an API seam to mediate between them, you ship HTML fragments from the edge and let the browser's native hypermedia engine do the rest.

Core claim: For most web applications, a hypermedia-first architecture achieves the same operational outcomes as a bespoke JSON-RPC layer — with significantly less infrastructure, less code, less framework risk, and faster delivery.
  • ~30kb client JS bundle
  • ~50% less application code
  • ~0 framework lock-in
  • Edge: global compute

The default: JSON → Client

  • Schema → codegen → typed client → framework hooks → API seam → backend
  • Client builds HTML from JSON
  • Client manages cache, retry, invalidation
  • Client composes state from multiple responses
  • Framework-coupled (React, Next.js, TanStack)
  • Bundle ships JS for every interaction

The alternative: Hypermedia

  • Route → handler → HTML fragment → browser
  • Server builds HTML from data
  • HTTP caching + HTMX invalidation
  • Server composes — always one request
  • Framework-free frontend (HTMX is 14kb)
  • Zero client JS for most interactions

What you're actually
paying for.

When a team chooses a JSON-first architecture with a heavy client framework, here's what they're actually building and maintaining — often without realising the accumulated cost.

1. A state management layer

TanStack Query, Redux, Zustand — all exist because JSON doesn't tell the browser how to render itself. The client has to parse, interpret, cache, invalidate, and re-fetch. That's a state machine you maintain forever.

2. A codegen pipeline

OpenAPI → generated types → generated hooks → versioned packages. Works great until the schema changes, the codegen breaks, or the generated code doesn't match what the backend actually does.

3. A rendering engine

React, Vue, Svelte — they exist to solve a problem that HTML already solves: turning data into pixels. When the server sends HTML instead of JSON, the browser's built-in rendering engine does this for free.

4. A composition layer

One screen = multiple API calls → client-side composition → loading states → error boundaries → stale-while-revalidate. The server could compose all of this into a single HTML response.

5. A cache coordination system

Client caches go stale. Invalidations need to propagate. Realtime updates need to trigger re-fetches. HTMX + SSE handles this declaratively — the server pushes an event, the affected fragment re-fetches.

6. Framework upgrades as a lifestyle

React 16 → 17 → 18 → 19. Next.js Pages → App Router → RSC. Each migration is weeks of engineering time. HTMX hasn't had a breaking change in years. HTML is forever.

The accumulated cost isn't just development time. It's cognitive load on every engineer, slower onboarding for new team members, deployment coupling between frontend and backend, and a perpetual dependency on framework maintainers' decisions.

HTML over the wire.
Compute at the edge.

The server is not a JSON vending machine. It's a hypermedia engine that responds to HTTP requests with HTML fragments the browser can render directly — or JSON when a non-HTML consumer (mobile app, CLI, third-party) needs it.

Browser Layer: HTMX · Datastar · Alpine.js
    ↕ HTTP (HTML fragments) · SSE (realtime signals)
Edge Layer (Cloudflare Workers): Hono Router · Auth Middleware · JSX Rendering · Effect TS
Service Layer: Effect TS · Drizzle ORM · Durable Objects
Data Layer: D1 (SQLite) · KV · R2

How a request flows

User clicks or acts → HTMX sends HTTP request → Hono resolves actor → policy middleware authorizes → handler queries data → JSX renders HTML fragment → HTMX swaps it into the DOM.
No client-side state management. The server is the source of truth. The browser renders what it receives. Interaction triggers HTTP requests. The server responds with updated HTML. This is the web working as designed.
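Stripped of framework detail, that flow is ordinary function composition. A plain-TypeScript sketch under assumed names (the cookie format, `resolveActor`, and `authorize` are all hypothetical, not Hono's API):

```typescript
// Hypothetical sketch of the request pipeline: each stage is a plain
// function, and "middleware" is just composition.
type Actor = { id: string; roles: string[] };
type Req = { cookie: string; path: string };

// Resolve the actor from the session cookie (fixture logic for the sketch).
const resolveActor = (req: Req): Actor =>
  req.cookie === "session=abc"
    ? { id: "u1", roles: ["viewer"] }
    : { id: "anon", roles: [] };

// Policy gate: does the actor hold the required role?
const authorize = (actor: Actor, required: string): boolean =>
  actor.roles.includes(required);

// Handler renders an HTML fragment from data.
const handler = (actor: Actor): string =>
  `<div id="metrics">Hello ${actor.id}</div>`;

function handle(req: Req): { status: number; body: string } {
  const actor = resolveActor(req);
  if (!authorize(actor, "viewer")) {
    return { status: 403, body: "<div>No permission</div>" };
  }
  return { status: 200, body: handler(actor) };
}
```

The real stack layers types, policies, and observability on top, but the shape of a request never gets more complicated than this.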

The tools, not the rules.

Each piece is chosen because it's the right tool for a hypermedia architecture on Cloudflare's edge platform. No polyfills, no workarounds, no "we'll migrate later."

Hono

The edge-native HTTP framework. Type-safe routing, middleware chains, JSX rendering — all on Cloudflare Workers. Sub-millisecond cold start. Replaces Express, Next.js Route Handlers, and tRPC in a single, purpose-built package.


HTMX

14kb of JavaScript that extends HTML with hypermedia controls. hx-get, hx-post, hx-swap — declarative attributes for server-driven UI updates. The browser becomes a fully-capable hypermedia client without a heavy client-side framework.


Datastar

Realtime reactive UI without a JavaScript framework. SSE-native — the server pushes HTML fragment updates and the browser applies them. Replaces WebSocket subscriptions and client-side state sync with a single persistent SSE connection.


D1 + Drizzle

Cloudflare's SQLite database with Drizzle as the type-safe ORM. Schema-first, migration-native, zero-latency from Workers. Cursor-based pagination is a native query pattern, not an abstraction layer.
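The cursor itself is just an opaque token over the sort key. A keyset-pagination sketch in plain TypeScript (the array stands in for the Drizzle `where id > cursor` query; all names and the token format are hypothetical):

```typescript
// Hypothetical keyset-pagination helpers: not Drizzle's API, just the pattern.
type Row = { id: number; name: string };

// Opaque-ish cursor token over the last-seen sort key.
const encodeCursor = (id: number): string => "c_" + id.toString(36);
const decodeCursor = (cursor: string): number => parseInt(cursor.slice(2), 36);

// Stand-in for: db.select().from(rows).where(gt(rows.id, after)).limit(n)
function page(rows: Row[], limit: number, cursor?: string) {
  const after = cursor ? decodeCursor(cursor) : 0;
  const items = rows
    .filter((r) => r.id > after)
    .sort((a, b) => a.id - b.id)
    .slice(0, limit);
  // A full page implies there may be more; a short page ends the sequence.
  const next =
    items.length === limit ? encodeCursor(items[items.length - 1].id) : null;
  return { items, next };
}
```

The server embeds `next` in the rendered HTML (a next-page link or header), so the client never parses or constructs cursors.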


Durable Objects

Single-instance stateful compute at the edge. Perfect for: session management, presence tracking, long-running operations, rate limiting, and real-time coordination. Each DO is a consistent state machine with sub-millisecond access.


Effect TS

Production-grade TypeScript library for building type-safe, composable, observable services. Tagged error unions, typed service layers (dependency injection), structured concurrency, and observability built into every pipeline.


Three operations.
That's all you need.

Every web application boils down to three things: reading data, writing data, and receiving live updates. In a hypermedia architecture, these map directly to HTTP verbs and SSE.

Queries → GET returning HTML

A query is an HTTP GET that returns an HTML fragment. The browser caches it via standard HTTP cache headers. HTMX issues the request and swaps the result into the DOM.

// Dashboard panel — one GET, one HTML response
<div
  hx-ext="sse"
  sse-connect="/sse/dashboard"
  hx-get="/api/dashboard/metrics"
  hx-trigger="load, sse:metrics.updated"
  hx-swap="innerHTML"
>
  Loading metrics...
</div>

The server responds with a complete HTML fragment — the composed view of whatever data that panel needs. No client-side composition. The sse: prefix in hx-trigger (from the HTMX SSE extension) causes an automatic refresh when the metrics.updated event arrives on the /sse/dashboard connection.

Commands → POST with idempotency

A command is an HTTP POST (or PUT for idempotent updates). The server processes it and returns an HTML fragment showing the updated state. Idempotency keys travel as HTTP headers.

// Update settings — form POST, returns updated panel
<form
  hx-post="/api/settings"
  hx-headers='{"idempotency-key": "{{uuid}}"}'
  hx-target="#settings-panel"
  hx-swap="innerHTML"
>
  <input name="key" type="text"/>
  <textarea name="value"></textarea>
  <button>Save</button>
</form>

The server returns the updated panel as HTML. The invalidation model is implicit — the response replaces the stale UI. Other fragments listening on the same SSE channel auto-refresh.
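Server-side, the idempotency key is a cache key: replay the stored response instead of re-running the command. A minimal in-memory sketch (in production a Durable Object or KV would back this, keyed per actor; `IdempotencyStore` is a hypothetical name):

```typescript
// Hypothetical idempotency store: the command runs once per key,
// repeats replay the stored HTML response.
class IdempotencyStore {
  private seen = new Map<string, string>();

  run(key: string, command: () => string): { replayed: boolean; html: string } {
    const cached = this.seen.get(key);
    if (cached !== undefined) return { replayed: true, html: cached };
    const html = command(); // execute the side effect exactly once
    this.seen.set(key, html);
    return { replayed: false, html };
  }
}
```

A retried POST (network blip, double click) therefore returns the same HTML fragment without re-executing the write.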

Subscriptions → SSE via Datastar

Realtime updates flow over Server-Sent Events. Datastar manages the connection, reconnect, and DOM patching. No WebSocket libraries, no client-side state sync.

// Live feed — SSE pushes fragment updates
<div
  data-on-load="$$sse('/api/feed/live')"
  sse-swap="feed-update"
>
  <div id="items">
    <!-- Server pushes updated items as HTML -->
  </div>
</div>

The server pushes HTML fragments over SSE. Cursors and replay are handled server-side via Durable Objects — the DO maintains the event log and replays from cursor on reconnect.

The architectural difference: Three HTTP verbs (GET, POST, PUT) and one delivery mechanism (SSE) cover everything a web application needs. No codegen, no client state management, no framework hooks. The server renders, the browser displays.

Durable Objects + SSE.
The right tool for each job.

Most teams build a subscription system from scratch — channels, cursors, replay, backpressure, auth re-check on reconnect. Cloudflare's platform gives you most of this for free.

Durable Objects for stateful coordination

Each live entity (user session, notification feed, collaborative document) is backed by a Durable Object. The DO owns:

  • The authoritative event log (cursor source)
  • Presence tracking (who's connected)
  • Backpressure (batching for slow clients)
  • Auth verification on each reconnect

SSE for delivery

Each client holds one SSE connection to its DO-backed endpoint. Benefits:

  • Native browser support — no libraries
  • Auto-reconnect with Last-Event-ID for cursor resume
  • Server pushes HTML fragment updates
  • Subscription caps enforced at the DO

Cursor replay — built in

SSE's native Last-Event-ID header gives you cursor replay for free. The Durable Object maintains an in-memory event log (configurable TTL per channel). On reconnect, the browser sends its last event ID; the DO replays missed events. Events beyond the TTL trigger a snapshot handshake — a "query first, then subscribe" pattern without any client code.
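The replay logic inside the Durable Object is small. A pure-TypeScript sketch, assuming monotonic event IDs and timestamp-based TTL pruning (`EventLog` and its method names are hypothetical):

```typescript
// Hypothetical event log a Durable Object might keep for Last-Event-ID replay.
type LogEvent = { id: number; at: number; html: string };

class EventLog {
  private events: LogEvent[] = [];
  private nextId = 1;

  constructor(private ttlMs: number) {}

  append(html: string, now: number): LogEvent {
    const ev = { id: this.nextId++, at: now, html };
    this.events.push(ev);
    // Prune events that have fallen out of the TTL window.
    this.events = this.events.filter((e) => now - e.at <= this.ttlMs);
    return ev;
  }

  // Replay events after lastEventId. Returning null signals
  // "cursor expired, snapshot first": query, then resubscribe.
  replayFrom(lastEventId: number, now: number): LogEvent[] | null {
    const live = this.events.filter((e) => now - e.at <= this.ttlMs);
    const oldest = live[0];
    if (lastEventId < (oldest?.id ?? this.nextId) - 1) return null;
    return live.filter((e) => e.id > lastEventId);
  }
}
```

On reconnect the browser's EventSource sends Last-Event-ID automatically; the DO either replays the gap or triggers the snapshot handshake.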

Typical subscription channels

| Channel       | DO Namespace   | Cursor TTL | Delivery             |
| ------------- | -------------- | ---------- | -------------------- |
| session.live  | SessionDO      | 2 hours    | SSE → HTML fragments |
| notifications | NotificationDO | 24 hours   | SSE → toast HTML     |
| entity.status | EntityDO       | 1 hour     | SSE → HTMX trigger   |
| collab.edit   | RoomDO         | 6 hours    | SSE → HTMX trigger   |

Typed, composable service layer.

Effect TS is the backbone of the server-side logic. It doesn't replace Hono or Drizzle — it composes them. Every handler, every database query, every side effect runs inside an Effect pipeline that provides typed errors, structured concurrency, dependency injection, and observability for free.

1. Typed error union — compiler-enforced error handling

Tagged error classes with exhaustive pattern matching. The compiler catches missing cases.

class NotFoundError {
  readonly _tag = "NotFoundError";
  constructor(readonly entity: string, readonly id: string) {}
}

class RateLimitedError {
  readonly _tag = "RateLimitedError";
  constructor(readonly retryAfter: number) {}
}

// Exhaustive matching — compiler catches missing cases
const renderError = Effect.catchTags({
  NotFoundError: (e) => renderNotFound(e.entity, e.id),
  RateLimitedError: (e) => renderRateLimited(e.retryAfter),
  ValidationError: (e) => renderValidation(e.fields),
  // Missing a tag? Compile error.
});

2. Service layers — composable dependency injection

Each domain service is an Effect Layer. Handlers declare dependencies; the runtime wires them. Test mode swaps in fixture layers without changing a single handler.

// Service declaration (Effect.Service pattern)
class UserService extends Effect.Service<UserService>()("UserService", {
  effect: Effect.gen(function* () {
    const db = yield* Database;
    const cache = yield* CacheService;
    return {
      getProfile: (id: string) =>
        Effect.gen(function* () {
          const user = yield* cache.getOrFetch(`user:${id}`, () =>
            db.query.users.findById(id)
          );
          if (!user) {
            return yield* Effect.fail(new NotFoundError("user", id));
          }
          return user;
        }),
    };
  }),
}) {}

// Handler uses it — no imports, no wiring
const handler = Effect.gen(function* () {
  const svc = yield* UserService;
  const profile = yield* svc.getProfile(id);
  return renderProfile(profile);
});

3. Retry & resilience — server-side, not client-side

Keep retry on the server — where it belongs. Typed retry schedules per error class. Transient DB failures get retried; validation errors don't.

// Retry transient DB failures, not validation errors
const safeQuery = db.query.users
  .findById(id)
  .pipe(
    Effect.retry(
      Schedule.exponential("100 millis").pipe(
        Schedule.jittered,
        // Cap total attempts rather than comparing raw durations
        Schedule.intersect(Schedule.recurs(5))
      )
    ),
    Effect.catchTag("DatabaseError", () =>
      Effect.fail(new DegradedError("user_data"))
    )
  );

4. Observability — built into every pipeline

Per-operation metrics (latency, error code, cache hit) come free when you add Effect.withSpan to any pipeline. Zero manual logging required.

// Every handler auto-instruments
app.get("/api/dashboard/metrics",
  authenticate(),
  authorize('admin', 'viewer'),
  async (c) => {
    const program = Effect.gen(function* () {
      const svc = yield* MetricsService;
      const data = yield* svc.getDashboard(c.req.query('id'));
      return c.html(<MetricsPanel {...data} />);
    }).pipe(
      Effect.withSpan("metrics.getDashboard", {
        attributes: { operation: "metrics.getDashboard" }
      }),
      renderError
    );
    return runHandler(program);
  }
);
Test mode as a layer swap: UserService.Live uses D1; UserService.Test uses an in-memory store with fixture injection. Same handlers, same routes, same type signatures — different backing services. Time control is a Clock layer swap. Error simulation is a service wrapper. Zero new routes needed.
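In plain TypeScript the layer-swap idea reduces to swapping an implementation behind an interface; Effect's Layer adds typed composition on top of the same shape. A sketch with hypothetical names:

```typescript
// Hypothetical sketch of the Live/Test swap, stripped of Effect machinery.
interface UserService {
  getProfile(id: string): { id: string; name: string };
}

const LiveUserService: UserService = {
  // In production this would hit D1 via Drizzle.
  getProfile: (id) => ({ id, name: `live:${id}` }),
};

const TestUserService: UserService = {
  // Fixture-backed: deterministic data, no database.
  getProfile: (id) => ({ id, name: "Fixture User" }),
};

// The handler never changes; only the provided service does.
const renderProfile = (svc: UserService, id: string): string =>
  `<div class="profile">${svc.getProfile(id).name}</div>`;
```

Same handler, two backings: the test suite provides the fixture implementation, production provides the live one, and no route or type signature moves.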

Meaningful responses,
not error codes.

In a JSON architecture, errors are typed objects the client maps to UI components. In a hypermedia architecture, the server renders the response directly — so errors become HTML fragments that mean something to the user.

Soft validation — let the request through

Traditional validation middleware rejects invalid input with a 400 before the handler runs. The client then has to parse the error, map it to UI state, and re-render. Soft-validation takes a different approach: validation issues are collected as typed context and passed into the handler. The handler renders them inline.

// Validation middleware — collects, doesn't reject
app.post("/api/items",
  softValidate(CreateItemSchema),
  async (c) => {
    const { issues, data } = c.get('validation');

    if (issues.length > 0) {
      // Return HTML with inline errors — same form, error messages rendered
      return c.html(
        <ItemForm data={data} errors={issues} />,
        { status: 422 }
      );
    }

    // Happy path — create the item, then return the refreshed list
    const item = await createItem(data);
    const items = await listItems(); // assumed helper: re-query the collection
    return c.html(<ItemList items={items} />);
  }
);

The user sees the same form with error messages highlighted next to the relevant fields. No client-side error parsing, no state management for "form has errors" — the server just re-renders the form with the issues visible.
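A minimal sketch of the soft-validation idea — collect issues, never reject — with a tiny inline rule type so the block stands alone (in the real stack this would wrap a Zod schema as Hono middleware; `softValidate` and the rule shape are hypothetical):

```typescript
// Hypothetical soft-validation helper: issues are collected as data,
// and the input always reaches the handler.
type Issue = { field: string; message: string };
type Rule = { field: string; check: (v: unknown) => string | null };

function softValidate(
  rules: Rule[],
  input: Record<string, unknown>
): { issues: Issue[]; data: Record<string, unknown> } {
  const issues: Issue[] = [];
  for (const rule of rules) {
    const message = rule.check(input[rule.field]);
    if (message) issues.push({ field: rule.field, message });
  }
  // Data passes through regardless; the handler decides how to render issues.
  return { issues, data: input };
}

const itemRules: Rule[] = [
  {
    field: "key",
    check: (v) => (typeof v === "string" && v.length > 0 ? null : "key is required"),
  },
  {
    field: "value",
    check: (v) => (typeof v === "string" ? null : "value must be a string"),
  },
];
```

The handler then branches on `issues.length` and re-renders the form with errors inline, exactly as in the route above.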

Every error is a rendered response

// Effect error → HTML fragment mapping
const renderError = Effect.catchTags({
  NotFoundError: (e) =>
    html(<EmptyState
      title={`No ${e.entity} found`}
      hint="This may have been removed or the link is incorrect."
    />),

  ValidationError: (e) =>
    html(<FormErrors fields={e.fields} />),

  RateLimitedError: (e) =>
    html(<RateLimitBanner retryAfter={e.retryAfter} />),

  DegradedError: (e) =>
    html(<PartialDataWarning source={e.source} />),

  ForbiddenError: (e) =>
    html(<NoPermission required={e.requiredRole} />),
});

Each error becomes an HTML fragment that replaces the relevant part of the page via HTMX swap. The user sees a helpful message, not a broken page or a toast with a cryptic code.

The principle: In a hypermedia architecture, error handling is just rendering. The server always returns HTML — success HTML, validation-error HTML, permission-error HTML, or degraded-data HTML. Every response is a meaningful UI.

Same requirements.
Less complexity.

How each architecture addresses the same operational requirements.

| Requirement        | JSON-first (default)                   | Hypermedia-first                                |
| ------------------ | -------------------------------------- | ----------------------------------------------- |
| Wire format        | JSON over RPC                          | HTML fragments + SSE                            |
| Client state       | TanStack Query / Redux cache           | DOM is state — no cache layer                   |
| Invalidation       | Event signals → client re-fetch        | SSE event → HTMX auto-refetch                   |
| Composition        | One composite endpoint per screen      | One GET returning composed HTML                 |
| Error handling     | Typed JSON errors → client switch      | Effect tagged errors → HTML error fragments     |
| Retry & resilience | Retry metadata → client retry          | Effect retry schedules — server-side            |
| Auth & sessions    | Cookie → context → policy gates        | Cookie → Hono middleware → policy gates         |
| Pagination         | Cursor in JSON response                | Cursor in hx-next header / next-page link       |
| Realtime           | WebSocket + cursor protocol            | SSE + Durable Objects + Last-Event-ID           |
| Caching            | Client cache + edge CDN                | HTTP Cache-Control + edge CDN                   |
| Codegen            | Required — schema → client types       | None — types live in Hono routes + Drizzle      |
| Client JS bundle   | React + TanStack + hooks (~180kb)      | HTMX + Datastar + Alpine (~30kb)                |
| Framework coupling | Next.js / React / Vue — locked in      | None — any HTML server works                    |
| Testing            | Fixtures in separate package           | Layer swap — same routes, test backing          |
| Mobile support     | Separate API or shared contract        | Content negotiation — same handler, JSON or HTML |
The pattern: Every requirement maps 1:1. The difference is whether you build application-level infrastructure (codegen, typed clients, cache layers, state managers) or use the platform's native capabilities (HTTP caching, SSE, DOM as state, HTML as wire format). Both work. One has fewer moving parts.

Show, don't tell.

The same feature — a dashboard panel with live metrics and settings. How it's built in each architecture.

JSON-first: ~40 lines across 4 files

// 1. Schema (packages/api-contracts/operations/dashboard.ts)
export const dashboardGetMetrics = {
  input: z.object({ dashboardId: z.string() }),
  output: z.object({ metrics: MetricsSummary, settings: Settings, ... }),
  policy: { roles: ['admin', 'viewer'] },
  cache: 'private',
};

// 2. Codegen'd hook (auto-generated)
export function useDashboardGetMetrics(input) { ... }

// 3. Feature component
function MetricsPanel() {
  const { data, error } = useDashboardGetMetrics({ dashboardId });
  if (error) return <ErrorBoundary error={error} />;
  return <Panel metrics={data.metrics} settings={data.settings} ... />;
}

// 4. API endpoint (seam)
export async function POST(req) {
  const actor = await resolveActor(req);
  const input = schema.parse(await req.json());
  enforcePolicy(actor, schema.policy);
  return backend.dashboard.getMetrics(input, actor);
}

Hypermedia: ~20 lines across 2 files

// 1. HTML (the "component")
<div
  hx-ext="sse"
  sse-connect="/sse/dashboard"
  hx-get="/api/dashboard/metrics"
  hx-trigger="load, sse:metrics.updated"
  hx-swap="innerHTML"
>
  Loading metrics...
</div>

// 2. Hono route handler (the "backend")
app.get("/api/dashboard/metrics",
  authenticate(),
  authorize('admin', 'viewer'),
  async (c) => {
    const program = Effect.gen(function* () {
      const svc = yield* MetricsService;
      const data = yield* svc.getDashboard(c.req.query('id'));
      return c.html(<MetricsPanel {...data} />);
    }).pipe(
      Effect.withSpan("dashboard.getMetrics"),
      renderError
    );
    return runHandler(program);
  }
);
Half the code. Zero codegen. Zero client state. The HTML fragment IS the contract. The route IS the type boundary. Hono's type inference gives compile-time safety on routes and params. Drizzle gives compile-time safety on queries.

Not theory. Production experience.

This architecture isn't assembled from blog posts. Every piece of it comes from shipping real systems and learning where the default approach breaks down.

HTMX Core Contributor

Built the WebSocket support for HTMX 4. Understanding hypermedia at the protocol level — not just how to use HTMX, but why it works and where the edges are — shaped the thinking behind this approach.

Realtime Systems

25+ years building low-latency audio/video pipelines and streaming architectures. The SSE + Durable Objects approach comes from production experience with sub-second delivery requirements — not theoretical latency budgets.

Cloudflare Native

Workers, Durable Objects, D1, R2, KV — shipping production systems on Cloudflare's platform daily. This architecture uses Cloudflare's actual capabilities, not aspirational ones from the docs.

AI / Agentic Systems

Voice agent platforms, sub-350ms ASR pipelines, realtime AI video training — the AI workload requirements that most architecture proposals ignore are baked into this approach from the start.

The point: Every piece of this architecture — Hono, HTMX, Datastar, Effect TS, Durable Objects, D1 — is something I use in production today. This isn't a thought experiment. It's what I reach for when the default approach gets too heavy.

Honest about the trade-offs.

No architecture is perfect. Here are the genuine considerations.

1. Frontend skill shift

Teams that are React-native will need to learn a different mental model. HTMX is far simpler than React + TanStack Query, the learning curve is short, and the productivity gains are immediate once internalised — but it IS a shift.

2. Rich interactions need JS supplements

Complex UIs (drag-and-drop, rich text editing, interactive canvases) need Alpine.js or light JS supplements. This is normal in hypermedia architectures — "JS as a supplement, not the foundation."

3. Type safety across the wire

JSON-first gives you compile-time type safety from schema to client hook. Hypermedia trades this for HTTP-level contracts: the route is typed (Hono), the query is typed (Drizzle), but the HTML wire format is not. Weaker type safety at the boundary, but zero codegen complexity.

4. Mobile and third-party consumers

The same Hono routes return JSON when Accept: application/json is sent. Content negotiation is built in — the same handler produces HTML or JSON based on the Accept header. No separate API layer needed.
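The negotiation itself is one branch on the Accept header. A self-contained sketch of the decision and the dual rendering (the helper names and the Metrics shape are hypothetical; Hono would expose the header via `c.req.header('Accept')`):

```typescript
// Hypothetical content-negotiation helper: same data, two representations.
type Metrics = { visitors: number; errors: number };

function negotiate(accept: string | undefined): "json" | "html" {
  // Default to HTML: the browser is the primary consumer.
  return accept?.includes("application/json") ? "json" : "html";
}

function respond(accept: string | undefined, data: Metrics): string {
  // One handler, one data source; the wire format is per-consumer.
  return negotiate(accept) === "json"
    ? JSON.stringify(data)
    : `<div class="metrics">${data.visitors} visitors, ${data.errors} errors</div>`;
}
```

A mobile app sends `Accept: application/json` and gets the data; the browser sends nothing special and gets the fragment — same route, same query, same auth.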

5. Incremental adoption

You don't have to go all-in. HTMX works alongside existing JavaScript. You can adopt this architecture route by route — start with one panel, one form, one feature. The server-driven model coexists with client-side rendering during migration.

One screen.
See for yourself.

Don't take my word for it. Pick one screen — something with realtime updates, auth, and error handling. I'll build it both ways: the default Next.js approach and the hypermedia approach. Same requirements, same design, side by side.

The bet: the hypermedia version ships faster, has fewer lines of code, zero framework lock-in, and a smaller bundle. If it doesn't, this whole document is wrong — and you'll know for certain.