Runtime Quickstart
Build your first TypeScript agent
Build and run your first agent in minutes.
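If you haven't installed the dependencies yet, add the runtime package and tsx to your project (this assumes the runtime is published to npm under the name used in the import below; npx tsx also works without installing tsx first):

npm install @reminix/runtime
npm install --save-dev tsx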
Create an Agent
// agent.ts
import { Agent, serve } from '@reminix/runtime';

// Create an agent with optional metadata
const agent = new Agent('my-agent', {
  metadata: {
    framework: 'custom',
    model: 'gpt-4',
    description: 'My first agent',
  },
});

// Handle invoke requests
agent.onInvoke(async (input) => {
  const name = (input.name as string) ?? 'World';
  return { output: `Hello, ${name}!` };
});

// Handle chat requests
agent.onChat(async (messages) => {
  const lastMessage = messages[messages.length - 1];
  const content = typeof lastMessage?.content === 'string' ? lastMessage.content : '';
  return {
    message: {
      role: 'assistant',
      content: `You said: ${content}`,
    },
  };
});

serve(agent, { port: 8080 });

Run
npx tsx agent.ts

Output:
Starting Reminix Agent Runtime...
Agents: my-agent
Listening on http://0.0.0.0:8080

Test
# Health check
curl http://localhost:8080/health
# Invoke
curl -X POST http://localhost:8080/agent/my-agent/invoke \
  -H "Content-Type: application/json" \
  -d '{"input": {"name": "World"}, "stream": false}'

# Chat
curl -X POST http://localhost:8080/agent/my-agent/chat \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello!"}], "stream": false}'
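You can make the same requests from code. Here is a minimal sketch using fetch, mirroring the invoke request above; the response is printed as-is, since its exact shape is defined by the runtime:

// client.ts: call the invoke endpoint from TypeScript (Node 18+ ships fetch)
async function main() {
  const res = await fetch('http://localhost:8080/agent/my-agent/invoke', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ input: { name: 'World' }, stream: false }),
  });
  console.log(await res.json());
}

main();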
Key Concepts

Agent Class
const agent = new Agent('my-agent');
// With optional metadata
const agent = new Agent('my-agent', {
  metadata: {
    framework: 'langchain',
    model: 'gpt-4',
    description: 'Customer support agent',
  },
});

Creates an agent with the given name. Names must be lowercase with only letters, numbers, hyphens, and underscores.
The optional metadata parameter allows you to attach information that will be available via the /_discover endpoint and displayed in the Reminix dashboard.
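For example, you can inspect a running agent's metadata with a quick request; note that the exact path here is an assumption based on the endpoint name mentioned above and may differ in your runtime version:

curl http://localhost:8080/_discover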
Invoke Handler
agent.onInvoke(async (input, ctx) => {
  // ctx has: conversationId, userId, custom
  return { output: 'result' };
});

Processes structured input and returns output. The ctx parameter is optional.
Chat Handler
agent.onChat(async (messages, ctx) => {
  // Access context for conversation tracking
  const convId = ctx?.conversationId;
  return { message: { role: 'assistant', content: 'response' } };
});

Processes chat messages and returns an assistant message. The ctx parameter is optional.
Context
The context object contains request information:
- conversationId: Track multi-turn conversations
- userId: Identify the user
- custom: Any additional custom fields
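As an illustration, a handler might read these fields as follows; this is a minimal sketch where the fallback values are arbitrary, but the field names come straight from the list above:

agent.onInvoke(async (input, ctx) => {
  // Fall back to defaults when the caller sends no context
  const user = ctx?.userId ?? 'anonymous';
  const convId = ctx?.conversationId ?? 'new-conversation';
  return { output: `Hello ${user}, this is conversation ${convId}` };
});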
Streaming Handlers
Use separate handlers for streaming responses:
agent.onInvokeStream(async function* (input, ctx) {
  yield { chunk: 'Hello ' };
  yield { chunk: 'World!' };
});

agent.onChatStream(async function* (messages, ctx) {
  for (const word of ['Hello', '!', ' How can I help?']) {
    yield { chunk: word };
  }
});

The server routes to the appropriate handler based on the client's stream flag:

- stream: false → uses onInvoke / onChat
- stream: true → uses onInvokeStream / onChatStream
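For example, flipping the stream flag in the invoke request from the Test section routes the call to onInvokeStream (the wire format of the streamed response is defined by the runtime and not shown here):

curl -X POST http://localhost:8080/agent/my-agent/invoke \
  -H "Content-Type: application/json" \
  -d '{"input": {"name": "World"}, "stream": true}'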
Serve Function
serve(agent, { port: 8080 });
// or multiple agents
serve([agent1, agent2], { port: 8080 });

Starts the HTTP server.
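When serving multiple agents, each one is presumably addressed by its own name in the URL, following the /agent/<name>/invoke pattern from the Test section. A hypothetical request to the second agent would look like this:

curl -X POST http://localhost:8080/agent/agent2/invoke \
  -H "Content-Type: application/json" \
  -d '{"input": {}, "stream": false}'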
Next Steps
- Streaming — Stream responses
- Testing — Test agents locally
- Examples — See more examples
- Deploy to Reminix — Deploy your agent