Streaming delivers agent output as it’s produced, instead of waiting for the full response. Reminix uses Server-Sent Events (SSE) for incremental updates to the client.
Streaming works with all three interaction patterns: tasks, conversations, and workflows.
Enabling streaming
Add stream: true to any request:
{
  "input": { "prompt": "Write a poem about AI" },
  "stream": true
}
The server responds with Content-Type: text/event-stream and sends events as the agent generates output.
Stream events
| Event | Description | Fields |
| --- | --- | --- |
| `text_delta` | Incremental text chunk | `delta: string` |
| `tool_call` | Tool invocation started | `tool_call: { id, type, function }` |
| `tool_result` | Tool execution result | `tool_call_id: string`, `output: string` |
| `message` | Complete message | `message: Message` |
| `step` | Workflow step update | `name`, `status`, `output?`, `pendingAction?` |
| `[DONE]` | Stream complete | — |
The raw event stream looks like this:
data: {"type": "text_delta", "delta": "The "}
data: {"type": "text_delta", "delta": "weather "}
data: {"type": "text_delta", "delta": "is sunny."}
data: {"type": "tool_call", "tool_call": {"id": "call_1", "type": "function", "function": {"name": "get_weather", "arguments": "{\"location\": \"SF\"}"}}}
data: {"type": "tool_result", "tool_call_id": "call_1", "output": "{\"temp\": 72}"}
data: [DONE]
Each line is prefixed with data: followed by a JSON object (except [DONE]). Events are separated by double newlines per the SSE specification.
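If you consume the raw stream without the SDK, each frame can be decoded by splitting on blank lines and stripping the `data: ` prefix. A minimal sketch, assuming a complete buffer (the `parseSSE` helper and the loose `StreamEvent` shape are illustrative, not part of the Reminix API):

```typescript
// Illustrative shape for parsed events; real payloads follow the table above.
type StreamEvent = { type: string; [key: string]: unknown }

// Parse a complete SSE buffer into events, stopping at the [DONE] sentinel.
function parseSSE(buffer: string): StreamEvent[] {
  const events: StreamEvent[] = []
  for (const frame of buffer.split("\n\n")) {
    const line = frame.trim()
    if (!line.startsWith("data: ")) continue
    const payload = line.slice("data: ".length)
    if (payload === "[DONE]") break
    events.push(JSON.parse(payload) as StreamEvent)
  }
  return events
}
```

In practice you would parse incrementally as network chunks arrive, buffering partial frames until a blank line completes them; the SDK handles this for you.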
Using the SDK
The SDK handles SSE parsing and exposes streams as async iterables.
const stream = await client.agents.invoke("my-agent", {
  input: { prompt: "Write a poem" },
  stream: true,
})

for await (const event of stream) {
  switch (event.type) {
    case "text_delta":
      process.stdout.write(event.delta)
      break
    case "tool_call":
      console.log("Calling tool:", event.tool_call.function.name)
      break
    case "tool_result":
      console.log("Tool result:", event.output)
      break
  }
}
Streaming conversations
Streaming works the same way with the chat endpoint:
const stream = await client.agents.chat("support-bot", {
  messages: [{ role: "user", content: "Help me reset my password" }],
  stream: true,
})

for await (const event of stream) {
  if (event.type === "text_delta") {
    process.stdout.write(event.delta)
  }
}
Streaming workflows
Workflow streams include step events in addition to text deltas, so you can track progress through the workflow in real time.
const stream = await client.agents.workflow("lead-router", {
  input: { task: "Route this lead", state: { leadId: "lead-123" } },
  stream: true,
})

for await (const event of stream) {
  switch (event.type) {
    case "step":
      console.log(`Step "${event.name}": ${event.status}`)
      break
    case "text_delta":
      process.stdout.write(event.delta)
      break
  }
}
Runtime side (async generators)
Agents produce streams by yielding from async generators. Raw strings are automatically wrapped as text_delta events.
import { agent, serve } from "@reminix/runtime"

const bot = agent("streamer", {
  type: "prompt",
  stream: true,
  handler: async function* (input) {
    yield "Hello "
    yield "world!"
  },
})

serve({ agents: [bot] })
You can also yield StreamEvent objects for finer control over the stream. This lets you emit tool calls, step updates, and other structured events directly from your handler.
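For example, a handler body might interleave structured events with plain strings. The event shapes below mirror the event table earlier on this page, but treat them as a sketch rather than the exact runtime `StreamEvent` type; the generator is shown standalone so it can be read in isolation, and would be passed as the `handler` in `agent(...)`:

```typescript
// Sketch of a streaming handler that mixes structured events with raw
// strings. The tool_call / tool_result shapes follow the event table;
// they are illustrative, not the canonical runtime types.
async function* handler(input: { prompt: string }) {
  yield {
    type: "tool_call",
    tool_call: {
      id: "call_1",
      type: "function",
      function: { name: "get_weather", arguments: '{"location": "SF"}' },
    },
  }
  yield { type: "tool_result", tool_call_id: "call_1", output: '{"temp": 72}' }
  yield "The weather " // raw strings become text_delta events
  yield "is sunny."
}
```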
Error handling
If an agent throws an error during streaming, the error is sent as a final event before the stream closes:
data: {"type": "error", "error": {"type": "ExecutionError", "message": "Agent threw an error"}}
data: [DONE]
Handle this in your client by checking for error events:
for await (const event of stream) {
  if (event.type === "error") {
    console.error("Stream error:", event.error.message)
    break
  }
  // handle other events...
}
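A common pattern is to fold the stream into a final string while converting error events into thrown exceptions. A minimal sketch, where `collectText` is a hypothetical helper (not part of the SDK) and the event type is pared down to the fields it needs:

```typescript
// Minimal event shape for this sketch; real events carry more fields.
type StreamEvent = { type: string; delta?: string; error?: { message: string } }

// Accumulate text_delta events into one string; surface errors as exceptions.
async function collectText(stream: AsyncIterable<StreamEvent>): Promise<string> {
  let text = ""
  for await (const event of stream) {
    if (event.type === "error") throw new Error(event.error?.message)
    if (event.type === "text_delta") text += event.delta ?? ""
  }
  return text
}
```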
Streaming is recommended for any user-facing agent interaction: users see output as it is generated rather than waiting for the full response, which makes the agent feel far more responsive.
Next steps
- TypeScript invoke(): Stream task agents with async iterators.
- Python chat(): Stream chat agents with async generators.
- Tasks: Single-shot invocations that can also stream.
- Conversations: Multi-turn chat with streaming responses.