You write the logic. Reminix handles the APIs, streaming, auth, and deployment. TypeScript or Python — any LLM, any framework.
```typescript
import { z } from "zod"
import OpenAI from "openai"
import { agent, tool, serve } from "@reminix/runtime"

const openai = new OpenAI()

const supportBot = agent("support-bot", {
  type: "chat",
  handler: async (input) => {
    const response = await openai.chat.completions.create({
      model: "gpt-4o",
      messages: input.messages,
    })
    return response.choices[0].message.content
  },
})

const lookupCustomer = tool("lookup_customer", {
  description: "Find customer by email",
  inputSchema: z.object({ email: z.string() }),
  handler: async ({ email }) => {
    return await crm.find(email) // `crm` is your own CRM client
  },
})

serve({
  agents: [supportBot],
  tools: [lookupCustomer],
})
```

From `serve()` to production, each step takes minutes, not days. No YAML. No Dockerfiles. No infrastructure to manage.
A handler function, a JSON schema, and whatever LLM or framework you prefer. That's it — no boilerplate, no config files.
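The handler-plus-schema pattern can be sketched in plain TypeScript. The `Schema` type and `wrap()` helper below are illustrative stand-ins for what a runtime does with your schema, not the actual Reminix API:

```typescript
// Illustrative sketch of the handler-plus-schema pattern. The Schema type
// and wrap() helper are stand-ins, not the real Reminix API.
type Schema<T> = { parse: (raw: unknown) => T }

// A hand-rolled schema for { email: string } (zod would normally do this)
const emailInput: Schema<{ email: string }> = {
  parse: (raw) => {
    const obj = raw as { email?: unknown }
    if (obj === null || typeof obj !== "object" || typeof obj.email !== "string") {
      throw new Error("invalid input: expected { email: string }")
    }
    return { email: obj.email }
  },
}

// The runtime's job, reduced to one line: validate, then delegate.
function wrap<T, R>(schema: Schema<T>, handler: (input: T) => R): (raw: unknown) => R {
  return (raw) => handler(schema.parse(raw))
}

const lookup = wrap(emailInput, ({ email }) => `customer record for ${email}`)
```

Invalid input never reaches your handler; the schema rejects it at the boundary.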
A single call to `serve({ agents, tools })` wires everything up. Agents get REST endpoints with streaming. Tools get MCP servers. Schemas become validation and docs automatically.
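The wiring this implies can be sketched as a simple mapping from registered names to routes. The route shapes below are assumptions for illustration, not the actual Reminix internals:

```typescript
// Sketch of what serve() implies: each registered agent gets a REST route,
// each tool gets a callable route. Route shapes are assumed, not Reminix's.
interface Registered { name: string }

function buildRoutes(opts: { agents: Registered[]; tools: Registered[] }): string[] {
  return [
    ...opts.agents.map((a) => `POST /agents/${a.name}/invoke`),
    ...opts.tools.map((t) => `POST /tools/${t.name}/call`),
  ]
}

const routes = buildRoutes({
  agents: [{ name: "support-bot" }],
  tools: [{ name: "lookup_customer" }],
})
```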
Deploy with one command, `reminix deploy`, or connect GitHub for automatic deploys on push. Scaling, secrets, versions, rollbacks — handled. You ship code, not infrastructure.

Without Reminix, you're stitching together API frameworks, auth middleware, SSE plumbing, OAuth token refresh, and monitoring dashboards. With Reminix, you write the logic and deploy. Everything else is handled.
Production endpoints with auth, validation, CORS, and rate limiting. Schema contracts enforced automatically.
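Rate limiting of this kind is typically a token bucket. A minimal sketch of the standard technique (not the Reminix implementation):

```typescript
// Minimal token-bucket rate limiter, the standard technique behind
// per-client rate limits. Illustrative only.
class TokenBucket {
  private tokens: number
  constructor(
    private capacity: number,
    private refillPerSec: number,
    private last = 0, // timestamp in seconds of the last allow() call
  ) {
    this.tokens = capacity
  }
  // `now` is passed in (seconds) rather than read from the clock,
  // which keeps the class deterministic and testable.
  allow(now: number): boolean {
    const elapsed = Math.max(0, now - this.last)
    this.last = now
    this.tokens = Math.min(this.capacity, this.tokens + elapsed * this.refillPerSec)
    if (this.tokens >= 1) {
      this.tokens -= 1
      return true
    }
    return false
  }
}
```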
Token-by-token streaming with backpressure and client reconnection. No WebSocket plumbing.
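Token streams of this kind are usually delivered as server-sent events: `data:` lines separated by blank lines. The exact wire format Reminix uses is an assumption here; a parser for the common shape:

```typescript
// SSE delivers tokens as "data: <chunk>" lines separated by blank lines.
// Extract the token payloads; "[DONE]" is a common end-of-stream sentinel.
// The wire format is assumed, not confirmed Reminix behavior.
function parseSseChunk(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length))
    .filter((data) => data !== "[DONE]")
}

const tokens = parseSseChunk("data: Hel\n\ndata: lo\n\ndata: [DONE]\n\n")
```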
Your tools become MCP servers. Any agent or MCP-compatible client can discover and call them.
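MCP clients discover tools through a `tools/list` response carrying each tool's name, description, and JSON schema. Building that manifest from tool definitions could look like this (shapes simplified):

```typescript
// Simplified sketch of an MCP tools/list response: each tool is exposed
// with its name, description, and JSON input schema for discovery.
interface ToolDef {
  name: string
  description: string
  inputSchema: object
}

function toolsListResponse(tools: ToolDef[]) {
  return {
    tools: tools.map(({ name, description, inputSchema }) => ({
      name,
      description,
      inputSchema,
    })),
  }
}

const manifest = toolsListResponse([
  {
    name: "lookup_customer",
    description: "Find customer by email",
    inputSchema: { type: "object", properties: { email: { type: "string" } } },
  },
])
```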
20+ services out of the box. We handle the OAuth dance, token storage, and automatic refresh. Your code just gets a valid token.
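"Automatic refresh" boils down to checking expiry, with a safety margin, before handing your code a token. A sketch with hypothetical shapes (a real refresh is an async HTTP call; it is kept synchronous here for clarity):

```typescript
// Sketch of automatic token refresh: hand back the cached access token
// while it is still valid, refresh when it is expired or about to expire.
// The shapes and the refresh callback are hypothetical, not the Reminix API.
interface StoredToken {
  accessToken: string
  expiresAt: number // epoch seconds
}

function getValidToken(
  stored: StoredToken,
  now: number,
  refresh: () => StoredToken, // in reality an async OAuth refresh request
  marginSec = 60,
): StoredToken {
  if (stored.expiresAt - marginSec > now) return stored // still fresh
  return refresh() // expired or about to expire: do the refresh dance
}
```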
Type-safe Python and TypeScript clients for invoking your agents. Import, connect, call.
Request tracing, error tracking, latency metrics. Built in.
Works with your stack — Reminix never touches your agent's internals.
TypeScript and Python. Any LLM provider.