You write the agent logic. Reminix handles everything else — deploy your handlers in TypeScript or Python and get production REST APIs for agents and MCP servers for tools, with streaming, auth, connections, and monitoring built in.

Define an agent

import { agent, serve } from "@reminix/runtime"

const bot = agent("support-bot", {
  type: "chat",
  description: "Customer support agent",
  handler: async (input) => {
    return "How can I help you today?"
  },
})

serve({ agents: [bot] })

Call it with the SDK

import Reminix from "@reminix/sdk"

const client = new Reminix({ apiKey: "your-api-key" })

const response = await client.agents.chat("support-bot", {
  messages: [{ role: "user", content: "Help me reset my password" }],
})
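Streaming is listed as a built-in capability, so a streaming variant of the same call might look like the sketch below. This is an assumption, not a confirmed API: the `stream: true` option and the chunk shape (`chunk.delta`) are hypothetical — check the SDK reference for the real signature.

```typescript
import Reminix from "@reminix/sdk"

const client = new Reminix({ apiKey: "your-api-key" })

// Hypothetical streaming call — option name and chunk fields are
// assumptions; the SDK may expose a different streaming interface.
const stream = await client.agents.chat("support-bot", {
  messages: [{ role: "user", content: "Summarize this report" }],
  stream: true,
})

for await (const chunk of stream) {
  process.stdout.write(chunk.delta ?? "")
}
```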

Or use the CLI

Invoke agents and call tools directly from the terminal — no client code needed.
# Invoke an agent
reminix agent invoke support-bot --prompt "How do I reset my password?"

# Stream the response
reminix agent invoke support-bot --prompt "Summarize this report" --stream

# Call a tool
reminix tool call lookup_customer -i '{"email": "user@example.com"}'
The CLI also works inside AI agents (Claude Code, Cursor), CI/CD pipelines, and shell scripts. See Scripting and CI/CD for more.
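As a sketch of the CI/CD use case, a pipeline step could shell out to the CLI with the same flags shown above. The environment variable name `REMINIX_API_KEY` and the log-file wiring are assumptions for illustration only.

```shell
#!/usr/bin/env sh
set -e

# Assumes the CLI reads credentials from the environment
# (variable name is a guess — consult the CLI docs).
: "${REMINIX_API_KEY:?set REMINIX_API_KEY in your CI secrets}"

# Same invocation shape as the examples above, fed from a build artifact.
SUMMARY=$(reminix agent invoke support-bot \
  --prompt "Summarize the failures in this test log: $(cat test.log)")

echo "$SUMMARY"
```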

Or use your existing framework

Already using the Vercel AI SDK, LangChain, OpenAI, or Anthropic? Wrap your existing code and deploy it as a Reminix agent in a few lines.
import { serve } from "@reminix/runtime"
import { VercelAIChatAgent } from "@reminix/vercel-ai"
import { openai } from "@ai-sdk/openai"

const bot = new VercelAIChatAgent(openai("gpt-4o"), {
  name: "support-bot",
  instructions: "You are a helpful support agent.",
})

serve({ agents: [bot] })

Key capabilities

  • Agents → REST APIs — Tasks (one-shot), Conversations (multi-turn), Workflows (pause/resume)
  • Tools → MCP servers — Define tools served via the Model Context Protocol, discoverable by Claude, Cursor, Windsurf, and any MCP client
  • Built-in LLM access — Every project includes LLM access out of the box. Bring your own keys when you’re ready for production
  • OAuth & Connections — Managed OAuth for 20+ services. Your code gets a valid token — Reminix handles the flow, storage, and refresh
  • Streaming — Real-time Server-Sent Events for progressive responses
  • Framework integrations — Drop-in support for OpenAI, Anthropic, LangChain, Vercel AI SDK, and Google
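To make the Tools → MCP bullet concrete, a tool definition might look like the sketch below. The `tool` export and its option names are assumptions — only `agent` and `serve` appear in the examples above — so treat this as a shape, not the actual API; see the Tools & MCP docs.

```typescript
// `tool` is a hypothetical export — the runtime examples above only
// demonstrate `agent` and `serve`; check the Tools & MCP reference.
import { serve, tool } from "@reminix/runtime"

const lookupCustomer = tool("lookup_customer", {
  description: "Look up a customer record by email",
  handler: async (input: { email: string }) => {
    // Replace with a real data-store lookup.
    return { email: input.email, plan: "pro" }
  },
})

// Served via MCP, the tool becomes discoverable by Claude, Cursor,
// Windsurf, and any other MCP client.
serve({ tools: [lookupCustomer] })
```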

Next steps

Quickstart

Get up and running in under 5 minutes.

Agents

Learn about agent types and how to define them.

Tools & MCP

Define tools and serve them via MCP.