Basic Agent

Define an agent with the @agent decorator and call serve() to start a server.
from reminix_runtime import agent, serve

@agent
async def hello(prompt: str) -> str:
    """A simple greeting agent."""
    return f"Hello! You said: {prompt}"

serve(agents=[hello])
When deployed, your agent is available at POST /v1/agents/hello/invoke.
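A request to that endpoint might look like the following. The JSON body mirrors the agent's input schema (a prompt agent takes a single `prompt` string); the exact request envelope shown here is an assumption, not a confirmed wire format:

```
POST /v1/agents/hello/invoke
Content-Type: application/json

{"prompt": "What can you do?"}
```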

The @agent Decorator

name (str): Agent name. Defaults to the function name.
description (str): Agent description. Defaults to the first paragraph of the function's docstring.
type (AgentType): One of: prompt, chat, task, thread, workflow. When set, the agent uses predefined input/output schemas for that type.
tags (list[str]): Tags for filtering and organization.
metadata (dict): Additional metadata as key-value pairs.

Agent Types

Every agent has a type that determines its input/output contract. Choose the type that matches your use case.

Prompt Agent (default)

Single prompt in, string out. Best for one-shot text generation tasks.
@agent(type="prompt")
async def summarizer(prompt: str) -> str:
    """Summarize text."""
    # prompt is a string, return a string
    return f"Summary of: {prompt}"

Chat Agent

Multi-turn conversations with message history.
@agent(name="support-bot", type="chat")
async def support_bot(messages: list, context=None) -> str:
    """Customer support chatbot."""
    last_message = messages[-1]["content"]
    return f"I can help with: {last_message}"
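Because handlers are plain async functions, you can exercise them locally without a server. A quick sketch, with the decorator omitted so it runs standalone:

```python
import asyncio

# Same handler body as support_bot above, without the @agent decorator
async def support_bot(messages: list, context=None) -> str:
    last_message = messages[-1]["content"]
    return f"I can help with: {last_message}"

history = [{"role": "user", "content": "billing"}]
print(asyncio.run(support_bot(history)))  # → I can help with: billing
```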

Task Agent

Structured input and output for data processing tasks.
@agent(name="analyzer", type="task")
async def analyzer(task: str, **kwargs) -> dict:
    """Analyze sentiment of text."""
    return {"sentiment": "positive", "confidence": 0.95}

Thread Agent

Full message history management — returns the entire conversation thread.
@agent(name="assistant", type="thread")
async def assistant(messages: list, context=None) -> list:
    """Returns full message history."""
    return messages + [{"role": "assistant", "content": "Response"}]

Workflow Agent

Multi-step workflows with pause/resume support and human-in-the-loop actions.
@agent(name="lead-router", type="workflow")
async def lead_router(task: str, state=None, resume=None, **kwargs):
    """Multi-step lead routing workflow."""
    if resume:
        return {"status": "completed", "steps": [], "result": {"routed": True}}
    return {
        "status": "paused",
        "steps": [{"name": "classify", "status": "completed"}],
        "pendingAction": {
            "step": "approval",
            "type": "confirmation",
            "message": "Route to sales?",
            "options": ["approve", "reject"],
        },
    }
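The pause/resume contract can also be exercised locally. A sketch with the decorator omitted so it runs standalone; the shape of the `resume` payload here is an assumption, since the handler above only checks it for truthiness:

```python
import asyncio

async def lead_router(task: str, state=None, resume=None, **kwargs):
    if resume:
        return {"status": "completed", "steps": [], "result": {"routed": True}}
    return {
        "status": "paused",
        "steps": [{"name": "classify", "status": "completed"}],
        "pendingAction": {"step": "approval", "type": "confirmation",
                          "message": "Route to sales?", "options": ["approve", "reject"]},
    }

first = asyncio.run(lead_router("route this lead"))
print(first["status"])  # paused: waiting on the approval action
# resume payload shape is hypothetical; the handler only checks truthiness
second = asyncio.run(lead_router("route this lead", resume={"action": "approve"}))
print(second["status"])  # completed
```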

Context Parameter

Add an optional context parameter to any agent to receive execution context such as caller identity and metadata.
@agent(name="my-agent", type="prompt")
async def my_agent(prompt: str, context=None):
    identity = context.get("identity") if context else None
    return f"Hello, {identity or 'anonymous'}!"

Streaming

Use async generators to stream responses. Yield strings to emit text_delta events.
@agent(name="streamer", type="prompt")
async def streamer(prompt: str):
    """Stream response word by word."""
    for word in prompt.split():
        yield word + " "
For finer control over stream events, yield StreamEvent objects:
from reminix_runtime import agent, TextDeltaEvent, StepEvent

@agent(name="workflow-streamer", type="workflow")
async def workflow_streamer(task: str, **kwargs):
    yield StepEvent(name="analyze", status="running")
    yield TextDeltaEvent(delta="Analyzing...")
    yield StepEvent(name="analyze", status="completed", output={"result": "done"})
When an agent function is an async generator (uses yield), the runtime automatically treats it as a streaming agent. Clients must pass stream=True to receive the events.
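Since streaming handlers are ordinary async generators, they can also be consumed directly in local tests. A sketch with the decorator omitted so it runs standalone:

```python
import asyncio

# Same generator body as streamer above, without the @agent decorator
async def streamer(prompt: str):
    for word in prompt.split():
        yield word + " "

async def collect(prompt: str) -> list[str]:
    # Gather every yielded chunk into a list
    return [chunk async for chunk in streamer(prompt)]

print(asyncio.run(collect("hello streaming world")))  # ['hello ', 'streaming ', 'world ']
```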

Schema from Type Hints

When not using a predefined type, input and output schemas are automatically derived from the function signature.
@agent
async def calculator(a: float, b: float, operation: str = "add") -> float:
    """Calculate result of two numbers.

    Args:
        a: First number.
        b: Second number.
        operation: Math operation (add, subtract, multiply, divide).
    """
    if operation == "add":
        return a + b
    if operation == "subtract":
        return a - b
    if operation == "multiply":
        return a * b
    if operation == "divide":
        return a / b
    raise ValueError(f"Unknown operation: {operation}")
Type hints map to JSON Schema types:

Python type → JSON Schema type
str   → string
int   → integer
float → number
bool  → boolean
list  → array
dict  → object
Pydantic models and TypedDict classes work as type hints — Reminix calls model_json_schema() to generate the JSON Schema for the parameter.
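The idea behind signature-based schemas can be illustrated with plain introspection. This is not Reminix's actual implementation, only a sketch of the concept: each annotation maps per the table above, and parameters without defaults become required:

```python
import inspect

# Mapping from the table above
PY_TO_JSON = {str: "string", int: "integer", float: "number",
              bool: "boolean", list: "array", dict: "object"}

async def calculator(a: float, b: float, operation: str = "add") -> float:
    ...

sig = inspect.signature(calculator)
schema = {
    "type": "object",
    "properties": {name: {"type": PY_TO_JSON[param.annotation]}
                   for name, param in sig.parameters.items()},
    "required": [name for name, param in sig.parameters.items()
                 if param.default is inspect.Parameter.empty],
}
print(schema)
# {'type': 'object',
#  'properties': {'a': {'type': 'number'}, 'b': {'type': 'number'},
#                 'operation': {'type': 'string'}},
#  'required': ['a', 'b']}
```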

Pydantic models for nested input

Use Pydantic models for nested object parameters. The runtime extracts a full JSON Schema from the model so callers see the structure, but it does not auto-validate at request time — the handler receives a plain dict. Call model_validate inside the handler when you want a typed instance.
from pydantic import BaseModel, Field
from reminix_runtime import agent

class SearchFilters(BaseModel):
    category: str | None = None
    min_score: float = Field(default=0.0, description="Minimum relevance score")
    tags: list[str] = []

@agent
async def search(query: str, filters: dict | None = None) -> list[dict]:
    """Search the document index.

    Args:
        query: Search query string.
        filters: Optional filters to narrow results.
    """
    parsed = SearchFilters.model_validate(filters) if filters else SearchFilters()

    # parsed.category, parsed.min_score, parsed.tags are now typed
    return [{"title": "Result", "category": parsed.category}]
The function parameter type for filters is dict, not SearchFilters — the runtime hands you a dict and you opt into Pydantic validation explicitly. Annotating the parameter as SearchFilters will not auto-coerce or validate; you’ll get an AttributeError the first time you access an attribute.

Multiple Agents

Serve any number of agents from a single process. Each agent gets its own POST /v1/agents/{name}/invoke endpoint.
serve(agents=[hello, support_bot, analyzer])

serve() Function

The serve() function starts a FastAPI server that exposes your agents and tools.
agents (list[Agent]): List of agents to serve.
tools (list[Tool]): List of tools to serve via MCP protocol.
port (int): Port number. Defaults to the PORT environment variable or 8080.
host (str): Host to bind. Defaults to the HOST environment variable or "0.0.0.0".
At least one agent or tool must be provided. Calling serve() with no agents and no tools will raise an error.

Next steps

Creating Tools

Define MCP tools alongside your agents.

Deploying

Ship your serve() to production.

Configuration & Secrets

Pass API keys and config to your handlers.

Frameworks

Wrap OpenAI, Anthropic, LangChain, or Google AI agents.