How to Use Server-Sent Events (SSE): Complete Guide (2026)
Real-time streaming from server to client, explained
Server-Sent Events (SSE) provide a simple, standardized way for servers to push real-time updates to web clients over a single HTTP connection. Unlike WebSockets, SSE is unidirectional (server to client only), uses plain HTTP, works through proxies and firewalls without special configuration, and automatically reconnects on disconnection. This makes SSE the ideal choice for live feeds, AI token streaming, notifications, and dashboards.
SSE vs WebSockets vs Long Polling
Before diving into implementation, it helps to understand when SSE is the right choice:
| Feature | SSE | WebSocket | Long Polling |
|---|---|---|---|
| Direction | Server to client | Bidirectional | Server to client |
| Protocol | HTTP | WS/WSS | HTTP |
| Auto-reconnect | Built-in | Manual | Manual |
| Binary data | No (text only) | Yes | Yes |
| Browser support | All modern browsers | All modern browsers | All browsers |
| Proxy-friendly | Yes | Sometimes problematic | Yes |
| Connection overhead | Low (single HTTP) | Low (single TCP) | High (repeated HTTP) |
| Best for | Feeds, notifications, AI streaming | Chat, games, collaboration | Legacy compatibility |
Use SSE when you only need server-to-client updates, want automatic reconnection, and prefer simplicity over bidirectional communication.
How SSE Works
The SSE protocol is remarkably simple:
- The client makes a standard HTTP GET request with Accept: text/event-stream
- The server responds with Content-Type: text/event-stream and keeps the connection open
- The server sends events as plain text, each separated by two newlines
- If the connection drops, the browser automatically reconnects (the exchange below shows what this looks like on the wire)
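To make the protocol concrete, here is roughly what the exchange looks like on the wire for the ticker server built later in this guide (headers trimmed, timestamps elided):

GET /events HTTP/1.1
Host: localhost:3000
Accept: text/event-stream

HTTP/1.1 200 OK
Content-Type: text/event-stream
Cache-Control: no-cache

id: 1
event: tick
data: {"time": "...", "count": 1}

id: 2
event: tick
data: {"time": "...", "count": 2}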
SSE Message Format
Each SSE message consists of one or more fields, each on its own line:
event: message_type
id: unique_id_123
retry: 5000
data: {"text": "Hello, world"}
The fields are:
- data: -- The message payload (required). Multiple data: lines are joined with newlines.
- event: -- A named event type (optional). Defaults to "message".
- id: -- A unique event ID (optional). Used for reconnection.
- retry: -- Reconnection interval in milliseconds (optional).
A blank line (double newline \n\n) signals the end of an event.
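Two details are easy to miss: a line that starts with a colon is a comment and is ignored by the client (handy for keep-alives, covered later), and consecutive data: lines are joined with newlines before the event is delivered. For example:

: this comment line is ignored by the client

event: log
data: first line of the payload
data: second line of the payload

The client receives a single "log" event whose data property contains the two lines joined by a newline.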
Server Implementation
Node.js (Express)
const express = require("express");
const app = express();
app.get("/events", (req, res) => {
// Set SSE headers
res.setHeader("Content-Type", "text/event-stream");
res.setHeader("Cache-Control", "no-cache");
res.setHeader("Connection", "keep-alive");
res.setHeader("Access-Control-Allow-Origin", "*");
// Flush headers immediately
res.flushHeaders();
// Send an event every 2 seconds
let counter = 0;
const interval = setInterval(() => {
counter++;
const data = JSON.stringify({
time: new Date().toISOString(),
count: counter
});
res.write(`id: ${counter}\n`);
res.write(`event: tick\n`);
res.write(`data: ${data}\n\n`);
}, 2000);
// Clean up on client disconnect
req.on("close", () => {
clearInterval(interval);
res.end();
console.log("Client disconnected");
});
});
app.listen(3000, () => {
console.log("SSE server running on http://localhost:3000");
});
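If you want to control how long the browser waits before reconnecting after a drop, you can also emit the retry: field once, right after flushing the headers. This is an optional addition to the handler above:

// Optional: ask the browser to wait 5 seconds before reconnecting
res.write("retry: 5000\n\n");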
Python (FastAPI)
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
import asyncio
import json
from datetime import datetime
app = FastAPI()
async def event_generator():
counter = 0
while True:
counter += 1
data = json.dumps({
"time": datetime.now().isoformat(),
"count": counter
})
yield f"id: {counter}\nevent: tick\ndata: {data}\n\n"
await asyncio.sleep(2)
@app.get("/events")
async def stream_events():
return StreamingResponse(
event_generator(),
media_type="text/event-stream",
headers={
"Cache-Control": "no-cache",
"Connection": "keep-alive",
}
)
Python (Flask)
from flask import Flask, Response
import json
import time
from datetime import datetime
app = Flask(__name__)
def generate_events():
counter = 0
while True:
counter += 1
data = json.dumps({
"time": datetime.now().isoformat(),
"count": counter
})
yield f"id: {counter}\nevent: tick\ndata: {data}\n\n"
time.sleep(2)
@app.route("/events")
def stream():
return Response(
generate_events(),
mimetype="text/event-stream",
headers={
"Cache-Control": "no-cache",
"X-Accel-Buffering": "no" # Disable nginx buffering
}
)
Client Implementation
Browser (EventSource API)
The browser provides a built-in EventSource API that handles connection management, automatic reconnection, and event parsing:
// Connect to the SSE endpoint
const eventSource = new EventSource("http://localhost:3000/events");
// Listen for the default "message" event
eventSource.onmessage = (event) => {
console.log("Message:", event.data);
};
// Listen for named events
eventSource.addEventListener("tick", (event) => {
const data = JSON.parse(event.data);
console.log(`Tick #${data.count} at ${data.time}`);
});
// Handle connection open
eventSource.onopen = () => {
console.log("Connection established");
};
// Handle errors and reconnection
eventSource.onerror = (error) => {
console.error("SSE error:", error);
if (eventSource.readyState === EventSource.CLOSED) {
console.log("Connection was closed");
} else {
console.log("Reconnecting...");
}
};
// Close the connection when done
// eventSource.close();
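One limitation worth noting: EventSource cannot attach custom request headers (such as Authorization), so authentication typically relies on cookies or a query-string token. If the endpoint is on another origin and uses cookies, pass the withCredentials option:

// Send cookies on a cross-origin SSE request. The server must then respond with
// Access-Control-Allow-Credentials: true and a specific (non-wildcard) origin.
const authedSource = new EventSource("https://api.example.com/events", {
  withCredentials: true,
});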
React Hook for SSE
import { useEffect, useState } from "react";
interface SSEOptions {
url: string;
eventName?: string;
onError?: (error: Event) => void;
}
function useSSE<T>(options: SSEOptions) {
const [data, setData] = useState<T | null>(null);
const [isConnected, setIsConnected] = useState(false);
useEffect(() => {
const eventSource = new EventSource(options.url);
eventSource.onopen = () => setIsConnected(true);
eventSource.onerror = (error) => {
setIsConnected(false);
options.onError?.(error);
};
const handler = (event: MessageEvent) => {
try {
const parsed = JSON.parse(event.data) as T;
setData(parsed);
} catch {
setData(event.data as unknown as T);
}
};
if (options.eventName) {
eventSource.addEventListener(options.eventName, handler);
} else {
eventSource.onmessage = handler;
}
return () => {
eventSource.close();
};
}, [options.url, options.eventName]);
return { data, isConnected };
}
// Usage in a component
function LiveDashboard() {
const { data, isConnected } = useSSE<{ time: string; count: number }>({
url: "/events",
eventName: "tick",
});
return (
<div>
<p>Status: {isConnected ? "Connected" : "Reconnecting..."}</p>
{data && (
<p>Count: {data.count} | Time: {data.time}</p>
)}
</div>
);
}
Real-World Use Case: AI Token Streaming
One of the most common uses of SSE in 2026 is streaming AI-generated tokens from an LLM API. Here is how to implement it:
Server (Proxy to AI API)
// Note: this handler assumes app.use(express.json()) is registered so req.body is parsed
app.post("/api/chat", async (req, res) => {
res.setHeader("Content-Type", "text/event-stream");
res.setHeader("Cache-Control", "no-cache");
res.setHeader("Connection", "keep-alive");
res.flushHeaders();
try {
const aiResponse = await fetch("https://api.openai.com/v1/chat/completions", {
method: "POST",
headers: {
"Authorization": `Bearer ${process.env.OPENAI_API_KEY}`,
"Content-Type": "application/json"
},
body: JSON.stringify({
model: "gpt-4o",
messages: req.body.messages,
stream: true
})
});
const reader = aiResponse.body.getReader();
const decoder = new TextDecoder();
while (true) {
const { done, value } = await reader.read();
if (done) break;
const chunk = decoder.decode(value);
// Forward the SSE data directly
res.write(chunk);
}
res.write("data: [DONE]\n\n");
res.end();
} catch (error) {
res.write(`event: error\ndata: ${JSON.stringify({ error: error.message })}\n\n`);
res.end();
}
});
Client (Streaming with fetch)
For POST requests (which EventSource does not support), use the Fetch API with a readable stream:
async function streamChat(messages) {
const response = await fetch("/api/chat", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ messages })
});
const reader = response.body.getReader();
const decoder = new TextDecoder();
let buffer = "";
while (true) {
const { done, value } = await reader.read();
if (done) break;
buffer += decoder.decode(value, { stream: true });
const lines = buffer.split("\n");
buffer = lines.pop() || "";
for (const line of lines) {
if (line.startsWith("data: ")) {
const data = line.slice(6);
if (data === "[DONE]") return;
try {
const parsed = JSON.parse(data);
const token = parsed.choices?.[0]?.delta?.content;
if (token) {
document.getElementById("output").textContent += token;
}
} catch {
// Skip non-JSON lines
}
}
}
}
}
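Calling the helper is then a one-liner (the sketch above assumes an element with id="output" exists in the page):

// Kick off a streamed completion and surface any failure in the console
streamChat([{ role: "user", content: "Explain SSE in one sentence." }])
  .catch((err) => console.error("Stream failed:", err));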
Troubleshooting Common Issues
| Problem | Cause | Solution |
|---|---|---|
| Events arrive in batches | Proxy buffering (nginx, Cloudflare) | Add X-Accel-Buffering: no header; disable proxy buffering |
| Connection drops after 60s | Server or proxy timeout | Send a comment line (: keepalive\n\n) every 30 seconds |
| No automatic reconnect | Using fetch instead of EventSource | Use EventSource for GET requests; implement manual retry for POST |
| CORS errors | Missing headers | Add Access-Control-Allow-Origin on the server |
| Duplicate events on reconnect | No event IDs | Include the id: field; track the Last-Event-ID header on the server (see the sketch below) |
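For the last row, here is a minimal Express sketch of resuming from Last-Event-ID. It assumes a hypothetical in-memory recentEvents array of { id, name, data } objects that your application keeps up to date:

// Replay missed events using the Last-Event-ID header the browser sends on reconnect.
// `recentEvents` is a hypothetical in-memory history maintained elsewhere in the app.
app.get("/events", (req, res) => {
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");
  res.flushHeaders();

  const lastId = Number(req.header("Last-Event-ID") || 0);

  // Catch the client up on anything it missed while disconnected
  for (const evt of recentEvents.filter((e) => e.id > lastId)) {
    res.write(`id: ${evt.id}\nevent: ${evt.name}\ndata: ${evt.data}\n\n`);
  }

  // ...then continue streaming new events as they are produced
});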
Keep-Alive Pattern
To prevent proxy timeouts, send periodic comment lines:
// Server-side keep-alive
const keepAlive = setInterval(() => {
res.write(": keepalive\n\n");
}, 30000);
req.on("close", () => {
clearInterval(keepAlive);
});
Best Practices
- Always include event IDs so clients can resume from where they left off after reconnection.
- Set a reasonable retry interval using the retry: field (the browser default is typically 3 seconds).
- Send keep-alive comments every 15-30 seconds to prevent proxy timeouts.
- Use named events to distinguish between different message types on the same stream.
- Handle backpressure by checking whether the client is still connected before writing (see the sketch after this list).
- Limit concurrent connections per user to avoid resource exhaustion.
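As a sketch of the backpressure point: in Node, res.write() returns false once the socket's buffer is full, and the "drain" event fires when it is safe to resume. A small helper, assuming the Express setup shown earlier:

// Write one SSE frame while respecting backpressure; returns true if more can be sent now.
function safeWrite(res, payload) {
  if (res.writableEnded || res.destroyed) return false; // client already gone
  const ok = res.write(payload);
  if (!ok) {
    // Socket buffer is full -- pause your producer and resume once it drains
    res.once("drain", () => {
      // e.g. restart the interval or resume reading from your event source
    });
  }
  return ok;
}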
Conclusion
Server-Sent Events offer a clean, standards-based approach to real-time server-to-client communication. They are simpler than WebSockets for unidirectional streaming, and the built-in browser EventSource API handles reconnection automatically. With the rise of AI streaming APIs, SSE has become an essential tool in every developer's toolkit.
If you are building applications that stream AI-generated content -- such as real-time video generation status updates, progressive image rendering, or live transcription feeds -- Hypereal AI provides streaming-compatible APIs for video generation, talking avatars, and image synthesis with built-in SSE support for tracking generation progress.