# Streaming

Stream responses to clients using TypeScript async generators.
## How It Works

Use dedicated streaming handlers with `yield` to stream chunks:

```typescript
agent.onInvokeStream(async function* (input) {
  yield { chunk: 'Hello ' };
  yield { chunk: 'World!' };
});
```

The runtime automatically:
- Routes `stream: true` requests to the streaming handler
- Converts chunks to Server-Sent Events (SSE)
- Sends `data: {"chunk": "..."}` for each `yield`
- Sends `data: [DONE]` when complete
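On the client side, these events can be read straight off the response body. The snippet below is a minimal consumption sketch, not part of the runtime API: the `consumeStream` name is ours, and it assumes the `streamer` agent from the examples further down is listening on `http://localhost:8080`.

```typescript
// Sketch of an SSE consumer for the stream produced above.
// Assumes the agent from the examples below is running on localhost:8080.
async function consumeStream(): Promise<void> {
  const response = await fetch('http://localhost:8080/agent/streamer/invoke', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ input: { text: 'Hello world' }, stream: true }),
  });

  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE events are separated by a blank line; each data line starts with "data: ".
    const events = buffer.split('\n\n');
    buffer = events.pop() ?? ''; // keep any partial event for the next read
    for (const event of events) {
      const data = event.replace(/^data: /, '').trim();
      if (data === '[DONE]') return;
      console.log(JSON.parse(data).chunk);
    }
  }
}
```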
## Streaming Invoke

```typescript
import { Agent, serve } from '@reminix/runtime';

const agent = new Agent('streamer');

agent.onInvokeStream(async function* (input) {
  const text = (input.text as string) ?? 'Hello world';
  const words = text.split(' ');

  for (const word of words) {
    yield { chunk: word + ' ' };
    await sleep(100); // Simulate processing
  }
});

function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

serve(agent, { port: 8080 });
```

## Streaming Chat
```typescript
agent.onChatStream(async function* (messages) {
  const lastMessage = messages[messages.length - 1];
  const content = typeof lastMessage?.content === 'string' ? lastMessage.content : '';
  const response = `You said: ${content}`;

  for (const char of response) {
    yield { chunk: char };
    await sleep(20); // Simulate typing
  }
});
```
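A request like the sketch below may work for trying the chat stream, but note that both the `/chat` path and the `messages` payload shape are assumptions here (this page only documents the invoke route); adjust them to match your runtime.

```bash
# Sketch only: assumes the chat route mirrors the invoke route and that the
# request body carries a "messages" array with role/content entries.
curl -X POST http://localhost:8080/agent/streamer/chat \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hi there"}], "stream": true}'
```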
## Handler Routing

The server routes to the appropriate handler based on the client's `stream` flag:

| Client Request | Handler Used |
|---|---|
| `stream: false` | `onInvoke` / `onChat` |
| `stream: true` | `onInvokeStream` / `onChatStream` |
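A single agent can register both handler kinds so it serves streaming and non-streaming clients. The sketch below assumes the non-streaming `onInvoke` handler receives the same `input` object and returns its result directly; the `{ text: ... }` return shape is an illustration, not a documented contract.

```typescript
// Sketch: one agent with both handler kinds.
agent.onInvoke(async (input) => {
  const text = (input.text as string) ?? 'Hello world';
  return { text: `Echo: ${text}` }; // stream: false -> single response (shape assumed)
});

agent.onInvokeStream(async function* (input) {
  const text = (input.text as string) ?? 'Hello world';
  for (const word of text.split(' ')) {
    yield { chunk: word + ' ' }; // stream: true -> SSE chunks
  }
});
```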
## Testing Streams

Request with `stream: true`:

```bash
curl -X POST http://localhost:8080/agent/streamer/invoke \
  -H "Content-Type: application/json" \
  -d '{"input": {"text": "Hello world"}, "stream": true}'
```

Response:
```
data: {"chunk": "Hello "}
data: {"chunk": "world "}
data: [DONE]
```
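For comparison, the same request with `stream: false` is routed to `onInvoke` (when one is registered, as in the Handler Routing sketch above) and returns a single JSON response instead of SSE; the exact response shape depends on your handler.

```bash
# Non-streaming variant: routed to onInvoke; returns one JSON body
# whose shape depends on the handler (not shown here).
curl -X POST http://localhost:8080/agent/streamer/invoke \
  -H "Content-Type: application/json" \
  -d '{"input": {"text": "Hello world"}, "stream": false}'
```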
## Next Steps

- Testing — Test agents locally
- Multi-Agent Example — Run multiple agents