# Runtime Quickstart

Build and run your first Python agent in minutes.

## Create an Agent
```python
# agent.py
from reminix.runtime import Agent, serve

# Create an agent with optional metadata
agent = Agent(
    "my-agent",
    metadata={
        "framework": "custom",
        "model": "gpt-4",
        "description": "My first agent",
    },
)

# Handle invoke requests
@agent.invoke
async def handle_invoke(input_data: dict) -> dict:
    name = input_data.get("name", "World")
    return {"output": f"Hello, {name}!"}

# Handle chat requests
@agent.chat
async def handle_chat(messages: list) -> dict:
    last_message = messages[-1]["content"] if messages else ""
    return {
        "message": {
            "role": "assistant",
            "content": f"You said: {last_message}",
        }
    }

if __name__ == "__main__":
    serve(agent, port=8080)
```

## Run

```shell
python agent.py
```

Output:
```
Starting Reminix Agent Runtime...
  Agents: my-agent
  Listening on http://0.0.0.0:8080
```

## Test
```shell
# Health check
curl http://localhost:8080/health

# Invoke
curl -X POST http://localhost:8080/agent/my-agent/invoke \
  -H "Content-Type: application/json" \
  -d '{"input": {"name": "World"}, "stream": false}'

# Chat
curl -X POST http://localhost:8080/agent/my-agent/chat \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello!"}], "stream": false}'
```

## Key Concepts
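You can make the same requests from Python. Below is a minimal sketch using only the standard library; the `build_invoke_body` and `invoke_agent` helpers are illustrative, not part of the Reminix SDK, and the call at the bottom assumes the server from `agent.py` is running locally.

```python
import json
from urllib import request

def build_invoke_body(payload: dict, stream: bool = False) -> bytes:
    """Encode the JSON body expected by the /invoke endpoint."""
    return json.dumps({"input": payload, "stream": stream}).encode()

def invoke_agent(base_url: str, agent: str, payload: dict) -> dict:
    """POST an invoke request and return the parsed JSON response."""
    req = request.Request(
        f"{base_url}/agent/{agent}/invoke",
        data=build_invoke_body(payload),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# With the server running:
# invoke_agent("http://localhost:8080", "my-agent", {"name": "World"})
```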
### Agent Class

```python
agent = Agent("my-agent")

# With optional metadata
agent = Agent(
    "my-agent",
    metadata={
        "framework": "langchain",
        "model": "gpt-4",
        "description": "Customer support agent",
    },
)
```

Creates an agent with the given name. Names must be lowercase and may contain only letters, numbers, hyphens, and underscores.

The optional `metadata` parameter lets you attach information that is made available via the `/_discover` endpoint and displayed in the Reminix dashboard.
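One plausible reading of the naming rule above can be expressed as a small regex check. The `is_valid_agent_name` helper is hypothetical, shown only to illustrate the constraint, and is not part of the Reminix SDK:

```python
import re

# Lowercase letters, digits, hyphens, and underscores only
AGENT_NAME_RE = re.compile(r"^[a-z0-9_-]+$")

def is_valid_agent_name(name: str) -> bool:
    """Return True if the name matches the documented character set."""
    return bool(AGENT_NAME_RE.fullmatch(name))

print(is_valid_agent_name("my-agent"))   # True
print(is_valid_agent_name("My Agent!"))  # False
```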
### Invoke Handler

```python
@agent.invoke
async def handle_invoke(input_data: dict, ctx: Context) -> dict:
    # ctx has: conversation_id, user_id, custom
    return {"output": "result"}
```

Processes structured input and returns output. The `ctx` parameter is optional.
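Since handlers are ordinary coroutines, you can exercise their logic directly with `asyncio` before wiring them into an agent. A sketch using an undecorated copy of the quickstart handler (assuming the decorator does not change the call signature):

```python
import asyncio

async def handle_invoke(input_data: dict) -> dict:
    # Same logic as the quickstart handler, without the decorator
    name = input_data.get("name", "World")
    return {"output": f"Hello, {name}!"}

result = asyncio.run(handle_invoke({"name": "Reminix"}))
print(result)  # {'output': 'Hello, Reminix!'}
```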
### Chat Handler

```python
@agent.chat
async def handle_chat(messages: list, ctx: Context) -> dict:
    # Access context for conversation tracking
    conv_id = ctx.get("conversation_id")
    return {"message": {"role": "assistant", "content": "response"}}
```

Processes chat messages and returns an assistant message. The `ctx` parameter is optional.
### Context

The context object contains request information:

- `conversation_id`: Track multi-turn conversations
- `user_id`: Identify the user
- `custom`: Any additional custom fields
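Assuming `ctx` supports dict-style access via `.get` (as in the chat handler example), a handler can read these fields defensively. A sketch with a plain dict standing in for the real context object; `describe_request` is a hypothetical helper, not part of the Reminix SDK:

```python
def describe_request(ctx: dict) -> str:
    """Summarize request context, falling back when fields are absent."""
    conv_id = ctx.get("conversation_id") or "new conversation"
    user_id = ctx.get("user_id") or "anonymous"
    return f"user={user_id}, conversation={conv_id}"

print(describe_request({"user_id": "u-123"}))
# user=u-123, conversation=new conversation
```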
### Streaming Handlers

Use separate handlers for streaming responses:

```python
@agent.invoke_stream
async def handle_invoke_stream(input_data: dict, ctx: Context):
    yield {"chunk": "Hello "}
    yield {"chunk": "World!"}

@agent.chat_stream
async def handle_chat_stream(messages: list, ctx: Context):
    for word in ["Hello", "!", " How can I help?"]:
        yield {"chunk": word}
```

The server routes to the appropriate handler based on the client's `stream` flag:

- `stream: false` → uses `@agent.invoke` / `@agent.chat`
- `stream: true` → uses `@agent.invoke_stream` / `@agent.chat_stream`
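A streaming handler is an async generator, so its chunks can be collected locally with `asyncio` for a quick check. A sketch using an undecorated copy of the chat stream handler; the `collect` helper is illustrative, not part of the Reminix SDK:

```python
import asyncio

async def handle_chat_stream(messages: list):
    # Same logic as the streaming chat handler, without the decorator
    for word in ["Hello", "!", " How can I help?"]:
        yield {"chunk": word}

async def collect(messages: list) -> str:
    """Concatenate the streamed chunks into one reply string."""
    parts = []
    async for piece in handle_chat_stream(messages):
        parts.append(piece["chunk"])
    return "".join(parts)

reply = asyncio.run(collect([{"role": "user", "content": "Hi"}]))
print(reply)  # Hello! How can I help?
```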
### Serve Function

```python
serve(agent, port=8080)

# or multiple agents
serve([agent1, agent2], port=8080)
```

Starts the HTTP server.
## Next Steps
- Streaming — Stream responses
- Testing — Test agents locally
- Examples — See more examples
- Deploy to Reminix — Deploy your agent