Agents become REST APIs.
Tools become MCP servers.

You write the logic. Reminix handles the APIs, streaming, auth, and deployment. TypeScript or Python — any LLM, any framework.

TypeScript & Python · Any LLM provider · Built-in LLM access · One-command deploy
index.ts
import { z } from "zod"
import OpenAI from "openai"
import { agent, tool, serve } from "@reminix/runtime"

const openai = new OpenAI()

const supportBot = agent("support-bot", {
  type: "chat",
  handler: async (input) => {
    const response = await openai.chat.completions.create({
      model: "gpt-4o",
      messages: input.messages,
    })
    return response.choices[0].message.content ?? ""
  },
})

const lookupCustomer = tool("lookup_customer", {
  description: "Find customer by email",
  inputSchema: z.object({ email: z.string() }),
  handler: async ({ email }) => {
    // `crm` is your own data layer — use whatever client you already have
    return await crm.find(email)
  },
})

serve({
  agents: [supportBot],
  tools: [lookupCustomer],
})

From serve() to production.

Each step takes minutes, not days. No YAML. No Dockerfiles. No infrastructure to manage.

1

Define your agents and tools

A handler function, a JSON schema, and whatever LLM or framework you prefer. That's it — no boilerplate, no config files.

2

Add serve({ agents, tools })

One function call wires everything up. Agents get REST endpoints with streaming. Tools get MCP servers. Schemas become validation and docs automatically.

3

Deploy

One command, or connect GitHub for automatic deploys on push. Scaling, secrets, versions, rollbacks — handled. You ship code, not infrastructure.

Terminal
$ reminix deploy
Connect GitHub for auto-deploy on push

Ship code, not infrastructure.

Without Reminix, you're stitching together API frameworks, auth middleware, SSE plumbing, OAuth token refresh, and monitoring dashboards. With Reminix, you write the logic and deploy. Everything else is handled.

REST APIs

Production endpoints with auth, validation, CORS, and rate limiting. Schema contracts enforced automatically.

SSE Streaming

Token-by-token streaming with backpressure and client reconnection. No WebSocket plumbing.
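On the wire this is standard server-sent events framing: each token arrives as a `data:` line, with events separated by blank lines. A minimal client-side sketch of that parsing (the exact payloads Reminix emits are an assumption here; the framing is the SSE spec):

```typescript
// Extract the payload from each `data:` line in a raw SSE chunk.
function parseSSE(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length))
}

// Two token events as they might arrive over the wire:
const tokens = parseSSE("data: Hel\n\ndata: lo, world\n\n")
console.log(tokens.join("")) // "Hello, world"
```

In the browser, `EventSource` (or a streaming `fetch`) does this framing for you; the sketch just shows what the stream contains.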

MCP Servers

Your tools become MCP servers. Any agent or MCP-compatible client can discover and call them.

OAuth & Connections

20+ services out of the box. We handle the OAuth dance, token storage, and automatic refresh. Your code just gets a valid token.

SDKs

Type-safe Python and TypeScript clients for invoking your agents. Import, connect, call.

Monitoring & Logs

Request tracing, error tracking, latency metrics. Built in.

Works with your stack — Reminix never touches your agent's internals.

OpenAI · Anthropic · Gemini · Vercel AI SDK · LangChain · Custom Code

TypeScript and Python. Any LLM provider.

Go live in five minutes.

Free while in early access. No credit card required.

Or browse examples