
Real-Time and Streaming UI Patterns in Next.js 16: SSE, WebSockets, and Live Data with React 19

05/2026 · 18 min read

Why Real-Time Is the New Baseline

Users in 2026 don't wait for page refreshes. They expect typing indicators, live cursors, instant notifications, and AI responses that stream in token-by-token. Real-time isn't a luxury feature anymore—it's table stakes.

But Next.js 16 doesn't ship with a built-in real-time primitive. There's no magic useRealtime hook. Instead, the framework gives you the building blocks—Route Handlers, the Web Streams API, and React 19's streaming capabilities—and expects you to compose them correctly.

This deep dive covers six production-ready patterns for real-time in Next.js 16: from server-sent events and WebSockets to AI token streaming, live presence, and resilient reconnection. Each pattern is self-contained, battle-tested, and ready to drop into your codebase.

  • Server-Sent Events via Route Handlers for one-way data push
  • WebSocket integration alongside the App Router
  • ReadableStream for AI-style token-by-token delivery
  • Optimistic presence with shared state
  • Live data polling with React 19 transitions
  • Resilient reconnection with exponential backoff

#1 Server-Sent Events with Route Handlers

SSE is the simplest way to push data from server to client. Unlike WebSockets, it works over plain HTTP, passes through every proxy and CDN, and reconnects automatically. For most real-time needs — notifications, live feeds, progress updates — SSE is the right choice.

Creating a Streaming Route Handler

Next.js 16 Route Handlers support returning a ReadableStream directly. This is the foundation for SSE.

app/api/events/route.ts
```typescript
export const runtime = 'nodejs';

export async function GET() {
  const encoder = new TextEncoder();
  let heartbeat: ReturnType<typeof setInterval>;
  let source: { unsubscribe: () => void };

  const stream = new ReadableStream({
    start(controller) {
      // Send a heartbeat every 30s to keep the connection alive
      heartbeat = setInterval(() => {
        controller.enqueue(encoder.encode(': heartbeat\n\n'));
      }, 30_000);

      // Simulate live events (replace with your data source)
      source = subscribeToEvents((event) => {
        const data = JSON.stringify(event);
        controller.enqueue(
          encoder.encode(`event: ${event.type}\ndata: ${data}\n\n`)
        );
      });
    },
    // cancel() runs when the client disconnects. Note: returning a cleanup
    // function from start() has no effect on a ReadableStream.
    cancel() {
      clearInterval(heartbeat);
      source.unsubscribe();
    },
  });

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform',
      Connection: 'keep-alive',
    },
  });
}
```

Consuming SSE in a Client Component

The browser's native EventSource API handles reconnection automatically—no library needed.

hooks/useServerEvents.ts
```typescript
'use client';

import { useEffect, useState } from 'react';

interface ServerEvent {
  type: string;
  payload: unknown;
  timestamp: number;
}

export function useServerEvents(url: string) {
  const [events, setEvents] = useState<ServerEvent[]>([]);
  const [status, setStatus] = useState<'connecting' | 'open' | 'closed'>('connecting');

  useEffect(() => {
    const source = new EventSource(url);

    source.onopen = () => setStatus('open');
    source.onerror = () => setStatus('connecting');

    source.addEventListener('notification', (e) => {
      const event = JSON.parse(e.data) as ServerEvent;
      // Keep only the most recent 100 events
      setEvents((prev) => [...prev.slice(-99), event]);
    });

    return () => {
      source.close();
      setStatus('closed');
    };
  }, [url]);

  return { events, status };
}
```

When to Choose SSE

  • One-way data flow: server pushes, client listens
  • Notifications, activity feeds, progress bars
  • Works behind corporate proxies and CDNs without special config
  • Built-in browser reconnection with Last-Event-ID support
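The last bullet deserves a closer look. When EventSource reconnects, the browser resends the id of the last event it received in a Last-Event-ID request header, so the server can replay anything the client missed. A small parsing helper makes this easy to use in a Route Handler — `getLastEventId` and `replayEventsSince` below are hypothetical names, not Next.js APIs:

```typescript
// Parse the Last-Event-ID header the browser resends on automatic
// EventSource reconnection. Returns null if absent or malformed.
export function getLastEventId(req: Request): number | null {
  const raw = req.headers.get('Last-Event-ID');
  if (raw === null) return null;
  const id = Number(raw);
  return Number.isFinite(id) ? id : null;
}

// Usage inside GET(req): replay any backlog newer than the client's last id.
// For this to work, each SSE message must also include an `id: ${n}` line.
// const lastId = getLastEventId(req);
// if (lastId !== null) replayEventsSince(lastId); // replayEventsSince is app-specific
```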
Result
  • Zero-dependency server-to-client streaming
  • Automatic reconnection handled by the browser
  • Works on every hosting provider without WebSocket support

#2 Streaming AI Responses Token-by-Token

The most visible real-time pattern in 2026 is AI response streaming. Users expect to see text appear word-by-word, not wait 5 seconds for a complete response. Next.js 16 Route Handlers and React 19 make this surprisingly clean.

Streaming Route Handler for AI

Most LLM providers return a ReadableStream. You can pipe it directly through a Route Handler.

app/api/chat/route.ts
```typescript
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toDataStreamResponse();
}
```

Consuming the Stream in React

On the client, read the stream chunk-by-chunk and update state as tokens arrive. The useChat hook from the AI SDK handles this for you, including the framing protocol that toDataStreamResponse emits. The manual pattern below appends raw chunks, so it assumes a plain text stream — pair it with result.toTextStreamResponse() on the server when you want this level of control:

components/ChatStream.tsx
```tsx
'use client';

import { useState, useCallback } from 'react';

export function ChatStream() {
  const [response, setResponse] = useState('');
  const [isStreaming, setIsStreaming] = useState(false);

  const send = useCallback(async (message: string) => {
    setIsStreaming(true);
    setResponse('');

    const res = await fetch('/api/chat', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        messages: [{ role: 'user', content: message }],
      }),
    });

    const reader = res.body?.getReader();
    const decoder = new TextDecoder();
    if (!reader) return;

    // Read the stream chunk-by-chunk and append tokens as they arrive
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      const chunk = decoder.decode(value, { stream: true });
      setResponse((prev) => prev + chunk);
    }

    setIsStreaming(false);
  }, []);

  return (
    <div>
      <div aria-live="polite" aria-busy={isStreaming}>
        {response || 'Ask me anything...'}
      </div>
      <button onClick={() => send('Explain streaming in Next.js')}>
        Send
      </button>
    </div>
  );
}
```

Key Architecture Decision

Keep AI streaming in Route Handlers, not Server Actions. Server Actions are designed for mutations (POST → revalidate → redirect). Streaming responses need a long-lived connection that Route Handlers provide naturally.

  • Route Handlers: streaming responses, SSE, long-lived connections
  • Server Actions: mutations, form submissions, cache invalidation
  • Never mix the two — each has a clear purpose
Result
  • Tokens appear instantly — no waiting for the full response
  • Clean separation between streaming (Route Handlers) and mutations (Server Actions)
  • Accessible with aria-live and aria-busy attributes

#3 WebSocket Integration Alongside the App Router

When you need bidirectional communication — multiplayer games, collaborative editing, real-time chat — SSE isn't enough. WebSockets give you a full-duplex channel. The challenge is integrating them cleanly with Next.js 16's server model.

Running a WebSocket Server Alongside Next.js

Next.js doesn't handle WebSocket upgrades natively. The cleanest pattern is running a lightweight WS server on a separate port (or path) in your custom server, or using a managed service like Ably, Pusher, or PartyKit.

server/ws.ts
```typescript
import { WebSocketServer, WebSocket } from 'ws';

const wss = new WebSocketServer({ port: 3001 });
const rooms = new Map<string, Set<WebSocket>>();

wss.on('connection', (ws, req) => {
  const roomId =
    new URL(req.url!, 'http://localhost').searchParams.get('room') ?? 'default';

  if (!rooms.has(roomId)) rooms.set(roomId, new Set());
  rooms.get(roomId)!.add(ws);

  ws.on('message', (raw) => {
    const message = raw.toString();
    // Broadcast to everyone else in the room
    for (const client of rooms.get(roomId)!) {
      if (client !== ws && client.readyState === WebSocket.OPEN) {
        client.send(message);
      }
    }
  });

  ws.on('close', () => {
    rooms.get(roomId)?.delete(ws);
    if (rooms.get(roomId)?.size === 0) rooms.delete(roomId);
  });
});

console.log('WebSocket server running on ws://localhost:3001');
```

Type-Safe Client Hook

Wrap the raw WebSocket API in a hook that handles lifecycle, reconnection, and typed messages.

hooks/useWebSocket.ts
```typescript
'use client';

import { useEffect, useRef, useState, useCallback } from 'react';

interface UseWebSocketOptions {
  url: string;
  onMessage?: (data: string) => void;
  reconnectDelay?: number;
}

export function useWebSocket({ url, onMessage, reconnectDelay = 2000 }: UseWebSocketOptions) {
  const wsRef = useRef<WebSocket | null>(null);
  const timerRef = useRef<ReturnType<typeof setTimeout> | undefined>(undefined);
  const [status, setStatus] = useState<'connecting' | 'open' | 'closed'>('connecting');

  const connect = useCallback(() => {
    const ws = new WebSocket(url);
    wsRef.current = ws;

    ws.onopen = () => setStatus('open');
    ws.onclose = () => {
      setStatus('closed');
      timerRef.current = setTimeout(connect, reconnectDelay);
    };
    ws.onmessage = (e) => onMessage?.(e.data);
  }, [url, onMessage, reconnectDelay]);

  useEffect(() => {
    connect();
    return () => {
      // Cancel any pending reconnect and detach onclose first, so closing
      // the socket on unmount doesn't schedule another connection
      clearTimeout(timerRef.current);
      if (wsRef.current) {
        wsRef.current.onclose = null;
        wsRef.current.close();
      }
    };
  }, [connect]);

  const send = useCallback((data: string) => {
    if (wsRef.current?.readyState === WebSocket.OPEN) {
      wsRef.current.send(data);
    }
  }, []);

  return { send, status };
}
```

SSE vs WebSocket — Decision Matrix

  • SSE: server-to-client only, auto-reconnect, works behind all proxies, simpler
  • WebSocket: bidirectional, lower latency, requires separate server or managed service
  • Default to SSE unless you need the client to push data back to the server in real-time
  • For collaborative features or multiplayer, WebSocket is the right call
Result
  • Full-duplex communication for collaborative features
  • Room-based architecture for scoped broadcasts
  • Type-safe client hook with automatic reconnection

#4 Live Data with Polling and React 19 Transitions

Not everything needs a persistent connection. For dashboards that update every few seconds, smart polling with React 19 transitions gives you live-feeling UI without the complexity of SSE or WebSockets.

Polling with useTransition

React 19's useTransition lets you refetch data in the background without blocking the current UI. Combined with a polling interval, it creates a smooth live experience.

components/LiveDashboard.tsx
```tsx
'use client';

import { useState, useEffect, useTransition } from 'react';

interface Metrics {
  activeUsers: number;
  requestsPerSecond: number;
  errorRate: number;
  p99Latency: number;
}

export function LiveDashboard() {
  const [metrics, setMetrics] = useState<Metrics | null>(null);
  const [isPending, startTransition] = useTransition();

  useEffect(() => {
    async function poll() {
      const res = await fetch('/api/metrics');
      const data = await res.json();
      // Mark the update as non-urgent so it never blocks the current UI
      startTransition(() => setMetrics(data));
    }

    poll(); // initial fetch
    const interval = setInterval(poll, 5_000);
    return () => clearInterval(interval);
  }, []);

  if (!metrics) return <div aria-busy="true">Loading metrics...</div>;

  return (
    <div className={isPending ? 'opacity-80 transition-opacity' : ''}>
      <dl>
        <dt>Active Users</dt>
        <dd>{metrics.activeUsers.toLocaleString()}</dd>
        <dt>Requests/s</dt>
        <dd>{metrics.requestsPerSecond.toLocaleString()}</dd>
        <dt>Error Rate</dt>
        <dd>{(metrics.errorRate * 100).toFixed(2)}%</dd>
        <dt>p99 Latency</dt>
        <dd>{metrics.p99Latency}ms</dd>
      </dl>
    </div>
  );
}
```

When Polling Beats Streaming

  • Data changes every few seconds, not milliseconds
  • No server infrastructure for persistent connections
  • Edge/serverless deployment where long-lived connections aren't supported
  • Dashboard-style UIs where slight staleness is acceptable
Result
  • Live-feeling UI with zero connection management
  • Non-blocking updates via React 19 transitions
  • Deploys anywhere — no WebSocket server required
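For completeness, here's what the `/api/metrics` endpoint polled above might look like. This is a hypothetical sketch — the hard-coded values stand in for whatever metrics store you actually query:

```typescript
// Hypothetical app/api/metrics/route.ts — the endpoint LiveDashboard polls.
export async function GET() {
  // In production, read these from your metrics store; static values here.
  const metrics = {
    activeUsers: 1280,
    requestsPerSecond: 342,
    errorRate: 0.0042,
    p99Latency: 187,
  };

  return Response.json(metrics, {
    // Polling endpoints should never be cached
    headers: { 'Cache-Control': 'no-store' },
  });
}
```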

#5 Live Cursors and Presence

Collaborative features — seeing who's online, watching their cursor move in real-time — have become expected in tools like Figma, Notion, and Google Docs. Here's how to build lightweight presence on top of WebSockets.

Presence State Architecture

Each connected client broadcasts a compact presence object. The server merges state and rebroadcasts to the room. Throttle cursor updates to ~60ms to avoid flooding.

hooks/usePresence.ts
```typescript
'use client';

import { useState, useEffect, useCallback, useRef } from 'react';
import { useWebSocket } from './useWebSocket';

interface Cursor {
  x: number;
  y: number;
}

interface PresenceUser {
  id: string;
  name: string;
  color: string;
  cursor: Cursor | null;
  lastSeen: number;
}

export function usePresence(
  roomId: string,
  user: { id: string; name: string; color: string }
) {
  const [others, setOthers] = useState<Map<string, PresenceUser>>(new Map());
  const lastSent = useRef(0);

  const { send, status } = useWebSocket({
    url: `ws://localhost:3001?room=${roomId}`,
    onMessage: (raw) => {
      const msg = JSON.parse(raw);
      if (msg.type === 'presence') {
        setOthers((prev) => {
          const next = new Map(prev);
          next.set(msg.user.id, { ...msg.user, lastSeen: Date.now() });
          return next;
        });
      }
      if (msg.type === 'leave') {
        setOthers((prev) => {
          const next = new Map(prev);
          next.delete(msg.userId);
          return next;
        });
      }
    },
  });

  const updateCursor = useCallback(
    (cursor: Cursor) => {
      const now = Date.now();
      if (now - lastSent.current < 60) return; // throttle to ~16fps
      lastSent.current = now;
      send(
        JSON.stringify({
          type: 'presence',
          user: { ...user, cursor },
        })
      );
    },
    [send, user]
  );

  // Clean up stale users every 5s
  useEffect(() => {
    const cleanup = setInterval(() => {
      setOthers((prev) => {
        const next = new Map(prev);
        for (const [id, u] of next) {
          if (Date.now() - u.lastSeen > 10_000) next.delete(id);
        }
        return next;
      });
    }, 5_000);
    return () => clearInterval(cleanup);
  }, []);

  return { others: Array.from(others.values()), updateCursor, status };
}
```

Rendering Live Cursors

components/LiveCursors.tsx
```tsx
'use client';

import { usePresence } from '@/hooks/usePresence';

export function LiveCursors({ roomId }: { roomId: string }) {
  const { others, updateCursor } = usePresence(roomId, {
    id: 'user-1',
    name: 'Alice',
    color: '#3b82f6',
  });

  return (
    <div
      className="relative h-full w-full"
      onPointerMove={(e) => updateCursor({ x: e.clientX, y: e.clientY })}
    >
      {others.map((user) =>
        user.cursor ? (
          <div
            key={user.id}
            className="pointer-events-none absolute z-50 transition-transform duration-75"
            style={{
              transform: `translate(${user.cursor.x}px, ${user.cursor.y}px)`,
            }}
          >
            <svg width="16" height="16" viewBox="0 0 16 16" fill={user.color}>
              <path d="M0 0L16 6L8 8L6 16Z" />
            </svg>
            <span
              className="ml-4 -mt-1 rounded px-1.5 py-0.5 text-xs text-white whitespace-nowrap"
              style={{ backgroundColor: user.color }}
            >
              {user.name}
            </span>
          </div>
        ) : null
      )}
    </div>
  );
}
```

Performance Considerations

  • Throttle cursor broadcasts to 60ms (~16fps) — smooth enough that users won't notice the difference
  • Use CSS transforms instead of top/left for GPU-accelerated cursor movement
  • Clean up stale presence entries to avoid memory leaks
  • Keep the presence payload small: id, name, color, cursor coordinates — nothing more
Result
  • Smooth collaborative cursors at 16fps with minimal bandwidth
  • Automatic stale user cleanup prevents ghost cursors
  • Composable hook pattern — reuse for typing indicators, selections, and more

#6 Resilient Reconnection Patterns

Real-time connections drop. Networks switch from Wi-Fi to cellular. Laptops wake from sleep. Servers redeploy. Your app needs to handle all of this gracefully — without the user noticing.

Exponential Backoff with Jitter

Naive reconnection (retry every 2 seconds) creates thundering herds when a server restarts and thousands of clients reconnect simultaneously. Exponential backoff with jitter spreads reconnections over time.

lib/reconnect.ts
```typescript
interface ReconnectOptions {
  baseDelay?: number;
  maxDelay?: number;
  maxRetries?: number;
  onConnect: () => void;
  onDisconnect: () => void;
  onMaxRetries?: () => void;
}

export function createReconnect(
  connect: () => WebSocket | EventSource,
  options: ReconnectOptions
) {
  const {
    baseDelay = 1000,
    maxDelay = 30_000,
    maxRetries = Infinity,
    onConnect,
    onDisconnect,
    onMaxRetries,
  } = options;

  let retries = 0;
  let instance: WebSocket | EventSource | null = null;

  function attempt() {
    if (retries >= maxRetries) {
      onMaxRetries?.();
      return;
    }
    instance = connect();

    if (instance instanceof WebSocket) {
      instance.onopen = () => {
        retries = 0; // reset on success
        onConnect();
      };
      instance.onclose = () => {
        onDisconnect();
        scheduleRetry();
      };
    } else {
      instance.onopen = () => {
        retries = 0;
        onConnect();
      };
      instance.onerror = () => {
        onDisconnect();
        // EventSource has its own retry logic; close it so our
        // backoff schedule is the only one running
        instance?.close();
        scheduleRetry();
      };
    }
  }

  function scheduleRetry() {
    // Exponential backoff: 1s, 2s, 4s, 8s... capped at maxDelay
    const delay = Math.min(baseDelay * 2 ** retries, maxDelay);
    // Add ±25% jitter to prevent thundering herd
    const jitter = delay * (0.75 + Math.random() * 0.5);
    retries++;
    setTimeout(attempt, jitter);
  }

  function disconnect() {
    retries = maxRetries; // prevent auto-reconnect
    instance?.close();
  }

  attempt();
  return { disconnect };
}
```

Connection Status UI

Always communicate connection status to the user. A subtle indicator is better than silent failures.

components/ConnectionBadge.tsx
```tsx
'use client';

const statusConfig = {
  connecting: { label: 'Reconnecting...', color: 'bg-amber-500', pulse: true },
  open: { label: 'Connected', color: 'bg-emerald-500', pulse: false },
  closed: { label: 'Offline', color: 'bg-red-500', pulse: false },
} as const;

export function ConnectionBadge({
  status,
}: {
  status: 'connecting' | 'open' | 'closed';
}) {
  const config = statusConfig[status];

  return (
    <div className="flex items-center gap-2 text-xs" role="status" aria-live="polite">
      <span
        className={`h-2 w-2 rounded-full ${config.color} ${
          config.pulse ? 'animate-pulse' : ''
        }`}
      />
      {config.label}
    </div>
  );
}
```

Offline Queue for Mutations

When the connection drops, queue outgoing messages and flush them on reconnect. This prevents data loss during brief network interruptions.

lib/offline-queue.ts
```typescript
export class OfflineQueue<T> {
  private queue: T[] = [];
  private isOnline = true;

  enqueue(item: T, send: (item: T) => void) {
    if (this.isOnline) {
      send(item);
    } else {
      this.queue.push(item); // buffer until we're back online
    }
  }

  flush(send: (item: T) => void) {
    while (this.queue.length > 0) {
      send(this.queue.shift()!);
    }
  }

  setOnline(online: boolean, send: (item: T) => void) {
    this.isOnline = online;
    if (online) this.flush(send);
  }
}
```
Result
  • Connections recover silently without user intervention
  • Exponential backoff with jitter prevents server overload on reconnect storms
  • Offline queue ensures zero data loss during brief disconnections
  • Clear status indicators keep users informed

Choosing the Right Pattern

There's no single “right” real-time solution. The best choice depends on your data flow, infrastructure, and user expectations:

  • Notifications, live feeds, progress → SSE (simplest, most reliable)
  • AI token streaming → Route Handler + ReadableStream
  • Chat, collaboration, multiplayer → WebSockets
  • Dashboards with 5-30s refresh → Polling with useTransition
  • Cursors, presence, typing indicators → WebSocket + throttled broadcasts

Start with the simplest pattern that meets your needs. You can always upgrade from polling to SSE to WebSockets as requirements grow—the hook abstractions make swapping the transport layer straightforward.
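One way to keep that swap cheap is to hide the transport behind a shared interface. The names below (`LiveTransport`, `createMemoryTransport`) are illustrative, not from any library — the in-memory implementation doubles as a test harness and as a template for SSE or WebSocket adapters:

```typescript
// A minimal transport contract: UI hooks depend on this shape, never on
// EventSource or WebSocket directly, so the transport can be swapped freely.
export interface LiveTransport {
  connect(onMessage: (data: string) => void): void;
  send?(data: string): void; // optional: one-way transports like SSE omit it
  close(): void;
}

// In-memory implementation for tests; an SSE or WebSocket adapter
// implements the same methods around its real connection.
export function createMemoryTransport(): LiveTransport & { emit(data: string): void } {
  let handler: ((data: string) => void) | null = null;
  return {
    connect(onMessage) {
      handler = onMessage;
    },
    emit(data) {
      handler?.(data); // simulate a message arriving from the server
    },
    close() {
      handler = null; // messages after close are dropped
    },
  };
}
```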

Production Checklist

Before shipping real-time features:

  • Implement reconnection with exponential backoff and jitter
  • Add heartbeats to detect silent connection drops
  • Queue mutations during offline periods and flush on reconnect
  • Throttle high-frequency updates (cursors, typing) to ≤16fps
  • Show connection status to the user — never fail silently
  • Test with network throttling and sudden disconnects in DevTools
  • Monitor connection counts and message throughput in production
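The heartbeat item cuts both ways: the server sends heartbeats (as in the SSE handler earlier), and the client should notice when they stop. A watchdog sketch — `createWatchdog` is a hypothetical helper, and the 65s timeout assumes the 30s server heartbeat used above:

```typescript
// Reset on every incoming message (including heartbeats); if nothing
// arrives within timeoutMs, the connection is presumed silently dead.
export function createWatchdog(timeoutMs: number, onDead: () => void) {
  let timer = setTimeout(onDead, timeoutMs);
  return {
    pulse() {
      // A message arrived — push the deadline forward
      clearTimeout(timer);
      timer = setTimeout(onDead, timeoutMs);
    },
    stop() {
      clearTimeout(timer);
    },
  };
}

// Usage with SSE: with a 30s server heartbeat, ~65s of silence is
// suspicious — close the EventSource and let reconnection take over.
// const watchdog = createWatchdog(65_000, () => source.close());
// source.onmessage = () => watchdog.pulse();
```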

Key Takeaways

  1. Use SSE for server-to-client streaming — it's simpler than WebSockets and reconnects automatically
  2. Stream AI responses through Route Handlers, not Server Actions
  3. WebSockets are for bidirectional communication — collaborative editing, chat, multiplayer
  4. Smart polling with React 19 transitions is a legitimate real-time strategy for dashboards
  5. Always implement exponential backoff with jitter for reconnection
  6. Show connection status and queue offline mutations — never let real-time failures go silent
