WebSocket vs SSE: Choosing the Right Tool
WebSocket (RFC 6455) provides bidirectional, full-duplex communication. SSE (Server-Sent Events, part of the HTML Living Standard) provides unidirectional, server-to-client streaming. If you chose WebSocket but your use case is actually unidirectional, you are carrying unnecessary complexity.
When SSE Is the Better Choice
| Use Case | WebSocket | SSE | Why |
|---|---|---|---|
| Live dashboard / metrics | Overkill | Ideal | Server pushes, client reads |
| News feed / activity stream | Overkill | Ideal | Append-only server stream |
| Notifications | Overkill | Ideal | Server → client only |
| AI response streaming | Overkill | Ideal | Token-by-token server output |
| Live chat | Required | Inadequate | Bidirectional text |
| Multiplayer games | Required | Inadequate | Low-latency bidirectional |
| Collaborative editing | Required | Inadequate | Bidirectional synchronization |
SSE Advantages Over WebSocket
- Works over HTTP/1.1 and HTTP/2 — no protocol upgrade required
- Automatic reconnection — built into the EventSource API
- Works through HTTP proxies and CDNs — Nginx, Cloudflare, AWS CloudFront
- Standard HTTP authentication — cookies, Authorization headers work normally
- No special load balancer config — no sticky sessions, no WebSocket proxy mode
- Simpler backend implementation — response streaming vs WebSocket frame handling
When to Keep WebSocket
Keep WebSocket for true bidirectional needs: chat, gaming, collaborative editing, terminal emulation, VoIP signaling, or any case where the client sends frequent messages that are not just occasional POST requests.
Step 1: Architecture Changes
Removing the WebSocket Upgrade
WebSocket upgrades from HTTP with a handshake:
# WebSocket: requires upgrade
GET /ws/updates HTTP/1.1
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
# SSE: plain HTTP GET with persistent response
GET /events HTTP/1.1
Accept: text/event-stream
SSE Response Format
The server sends events as plain text with a specific format:
HTTP/1.1 200 OK
Content-Type: text/event-stream
Cache-Control: no-cache
X-Accel-Buffering: no # Disable Nginx response buffering
# Event format (each field ends with \n, event ends with \n\n):
id: 1234
event: order-update
data: {"orderId": "abc", "status": "shipped"}
id: 1235
event: notification
data: {"message": "Your package is out for delivery"}
# Heartbeat (keep connection alive, prevent proxy timeout):
: ping
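The wire format above is straightforward to produce in code. A minimal formatter sketch in Python (the field names come from the SSE format; the function name and argument shape are illustrative):

```python
import json

def format_sse_event(event_id, event_type, data):
    """Serialize one SSE event: each field on its own line,
    a blank line terminates the event.

    Note: if the data payload itself contained newlines, the spec
    requires one 'data:' line per payload line; JSON-encoding the
    payload, as here, sidesteps that case.
    """
    return (
        f'id: {event_id}\n'
        f'event: {event_type}\n'
        f'data: {json.dumps(data)}\n\n'
    )

wire = format_sse_event(1234, 'order-update', {'orderId': 'abc', 'status': 'shipped'})
```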
Backend Implementation
# FastAPI SSE endpoint:
from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse
import asyncio
import json

app = FastAPI()

async def event_generator(request: Request, user_id: str):
    """Yield SSE events for a user's real-time updates."""
    event_id = 0
    while True:
        # Check if client disconnected:
        if await request.is_disconnected():
            break
        # Fetch pending events from Redis or database:
        events = await get_pending_events(user_id)
        for event in events:
            event_id += 1
            payload = (
                f'id: {event_id}\n'
                f'event: {event["type"]}\n'
                f'data: {json.dumps(event["data"])}\n\n'
            )
            yield payload
        # Heartbeat every 15 seconds; this is also the polling
        # interval, so lower it if events must arrive sooner:
        yield ': ping\n\n'
        await asyncio.sleep(15)

@app.get('/events')
async def sse_endpoint(request: Request, user_id: str) -> StreamingResponse:
    return StreamingResponse(
        event_generator(request, user_id),
        media_type='text/event-stream',
        headers={
            'Cache-Control': 'no-cache',
            'X-Accel-Buffering': 'no',
        },
    )
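To smoke-test the endpoint without a browser, the stream can be parsed with a few lines of stdlib Python. A simplified sketch of the parsing rules (it strips all leading spaces from field values, where the spec strips at most one; a production client should use a maintained SSE library):

```python
def parse_sse(lines):
    """Parse an iterable of text lines into (id, event, data) tuples.
    A blank line terminates each event; lines starting with ':' are
    comments (heartbeats) and are ignored."""
    fields = {}
    for line in lines:
        line = line.rstrip('\n')
        if line.startswith(':'):   # comment / heartbeat
            continue
        if line == '':             # blank line: dispatch the buffered event
            if 'data' in fields:
                yield (fields.get('id'), fields.get('event', 'message'), fields['data'])
            fields = {}
            continue
        key, _, value = line.partition(':')
        value = value.lstrip(' ')
        if key == 'data':
            # Multiple data lines are joined with newlines per the spec:
            fields['data'] = fields['data'] + '\n' + value if 'data' in fields else value
        else:
            fields[key] = value

stream = ['id: 1\n', 'event: ping\n', 'data: {"n": 1}\n', '\n', ': heartbeat\n', '\n']
events = list(parse_sse(stream))
```

Note that the lone heartbeat comment produces no event: with no `data` field buffered, the blank line dispatches nothing.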
EventSource API (Browser Client)
// Replace WebSocket client code:
// const ws = new WebSocket('wss://your-domain.com/ws/updates');
// ws.onmessage = (event) => { ... };

// With EventSource:
const es = new EventSource('/events?user_id=42', {
  withCredentials: true, // Send cookies (for auth)
});

// Listen for specific event types:
es.addEventListener('order-update', (event) => {
  const data = JSON.parse(event.data);
  updateOrderStatus(data.orderId, data.status);
});

es.addEventListener('notification', (event) => {
  const data = JSON.parse(event.data);
  showNotification(data.message);
});

// Generic handler for unnamed events:
es.onmessage = (event) => {
  console.log('Event:', event.data);
};

// Error handling:
es.onerror = (error) => {
  // EventSource automatically reconnects — this fires during reconnection attempts
  console.warn('SSE error, will retry:', error);
};

// Cleanup when done:
es.close();
Step 2: Handling Bidirectional Needs
The most common concern: "I need the client to send data too." The SSE hybrid pattern handles this cleanly: use SSE for server→client and regular HTTP POST for client→server.
// Client sends a message via POST, server streams response via SSE:
async function sendChatMessage(text) {
  await fetch('/api/messages', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text }),
  });
  // Response comes back through the open SSE connection
}
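On the server side, the POST handler's only job is to validate the message and hand it to whatever feeds the SSE generator. A framework-agnostic sketch using an asyncio.Queue as the per-user event source (the queue registry, helper name, and event shape are all illustrative):

```python
import asyncio

# One queue per connected user; the SSE generator reads from it.
user_queues: dict[str, asyncio.Queue] = {}

async def handle_post_message(user_id: str, text: str) -> None:
    """Accept a client-to-server message and enqueue the
    corresponding server-to-client event for SSE delivery."""
    if not text.strip():
        raise ValueError('empty message')
    queue = user_queues.setdefault(user_id, asyncio.Queue())
    await queue.put({'type': 'chat-message', 'data': {'text': text}})

async def demo():
    # Simulate the POST handler, then read the event as the
    # SSE generator for that user would:
    await handle_post_message('42', 'hello')
    return await user_queues['42'].get()

event = asyncio.run(demo())
```

In a real deployment the queue would be replaced by a shared broker (Redis Pub/Sub, a message queue) so that the POST can land on a different backend instance than the one holding the SSE connection.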
This pattern works well for: chat (POST message, SSE delivery), collaborative editing (POST operation, SSE broadcast), command/response (POST command, SSE result stream). It does not work for latency-critical gaming (10ms round-trip requirement) or VoIP.
Step 3: Reconnection and Event Replay
Automatic Reconnection
The EventSource API reconnects automatically after a disconnect. The browser sends the Last-Event-ID header to tell the server which event it last received:
GET /events HTTP/1.1
Last-Event-ID: 1234   # Browser sends this on reconnect
Your server uses this to replay missed events:
from fastapi import Header

@app.get('/events')
async def sse_endpoint(
    request: Request,
    user_id: str,
    last_event_id: str | None = Header(None),
) -> StreamingResponse:
    return StreamingResponse(
        event_generator(request, user_id, resume_from=last_event_id),
        media_type='text/event-stream',
    )

async def event_generator(request: Request, user_id: str, resume_from: str | None):
    # Replay missed events if client reconnected:
    if resume_from:
        missed = await get_events_after(user_id, event_id=int(resume_from))
        for event in missed:
            yield format_sse_event(event)
    # Continue with live events...
    async for event in subscribe_to_live_events(user_id):
        if await request.is_disconnected():
            break
        yield format_sse_event(event)
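Replay requires keeping a short history of recent events per user. A minimal in-memory sketch of that history (class and method names are illustrative; a production system would back this with Redis Streams or a database table, since in-memory history ties replay to a single instance):

```python
from collections import deque

class EventHistory:
    """Keep the last `maxlen` events so reconnecting clients can catch up."""

    def __init__(self, maxlen: int = 1000):
        self._events: deque = deque(maxlen=maxlen)  # oldest entries evicted first

    def append(self, event_id: int, event: dict) -> None:
        self._events.append((event_id, event))

    def events_after(self, event_id: int) -> list[dict]:
        """Return events newer than the client's Last-Event-ID."""
        return [e for (eid, e) in self._events if eid > event_id]

history = EventHistory()
history.append(1234, {'type': 'order-update', 'data': {'status': 'shipped'}})
history.append(1235, {'type': 'notification', 'data': {'message': 'out for delivery'}})
missed = history.events_after(1234)  # what a client reconnecting with Last-Event-ID: 1234 missed
```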
Control reconnection interval with the retry field:
retry: 5000\n\n # Browser waits 5 seconds before reconnecting
Step 4: Deployment and Load Balancer Configuration
Nginx Configuration
The critical change: disable response buffering. Nginx buffers responses by default, which breaks SSE (events are queued and sent in batches instead of immediately).
location /events {
    proxy_pass http://backend;
    proxy_http_version 1.1;          # Required for keep-alive
    proxy_buffering off;             # Critical: send events immediately
    proxy_cache off;
    proxy_read_timeout 3600s;        # Allow long-lived connections
    proxy_set_header Connection '';  # Remove Connection: upgrade
}
# The old WebSocket location block can be removed
# (or kept temporarily for backward compatibility).
No Sticky Sessions Required
Unlike WebSocket, SSE connections do not maintain mutable in-memory state per connection. Events come from a shared store (Redis Pub/Sub, database polling, message queue). Any backend instance can handle any client connection — standard round-robin load balancing works correctly.
HTTP/2 Connection Multiplexing
With HTTP/2, multiple SSE streams share a single TCP connection via multiplexing. A browser limited to 6 HTTP/1.1 connections per host can open hundreds of SSE streams over HTTP/2 without running out of connections:
HTTP/1.1: Each SSE stream = 1 of 6 available connections (problematic)
HTTP/2: All SSE streams multiplexed over 1 connection (bounded only by the server's advertised max concurrent streams, commonly 100 or more)
Ensure both Nginx and your backend support HTTP/2, and that TLS termination is in place (all major browsers require TLS for HTTP/2).
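Enabling HTTP/2 in Nginx depends on the version: since 1.25.1 it is a standalone directive, while older versions attach it to the listen directive. A sketch (certificate paths are placeholders):

```nginx
server {
    listen 443 ssl;
    http2 on;                 # Nginx >= 1.25.1
    # listen 443 ssl http2;   # Older Nginx: combine into the listen directive instead

    ssl_certificate     /etc/nginx/certs/example.crt;  # placeholder path
    ssl_certificate_key /etc/nginx/certs/example.key;  # placeholder path
}
```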