Deploy agents as REST APIs your apps can call, or tools as MCP servers that AI clients discover automatically.
serve() to production. No YAML. No Dockerfiles. No infra to manage.
A handler, a schema, whatever LLM you prefer.
One serve() call wires up REST endpoints, MCP servers, validation, and docs.
One command or connect GitHub. Scaling, secrets, rollbacks — handled.
```ts
import { z } from "zod"
import OpenAI from "openai"
import { agent, tool, serve } from "@reminix/runtime"

const openai = new OpenAI()

const supportBot = agent("support-bot", {
  type: "chat",
  handler: async (input) => {
    const response = await openai.chat.completions.create({
      model: "gpt-4o",
      messages: input.messages,
    })
    return response.choices[0].message.content
  },
})

const lookupCustomer = tool("lookup_customer", {
  description: "Find customer by email",
  inputSchema: z.object({ email: z.string() }),
  handler: async ({ email }) => {
    // `crm` is your own customer-store client, defined elsewhere
    return await crm.find(email)
  },
})

serve({
  agents: [supportBot],
  tools: [lookupCustomer],
})
```

Everything between your code and production, handled.
- Auth, validation, CORS, rate limiting.
- Single endpoint, all tools discoverable.
- Token-by-token streaming, with backpressure.
- 20+ services, automatic token refresh.
- Type-safe clients for your agents.
- Tracing, errors, latency. Built in.
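Token-by-token streaming with backpressure can be pictured as an async iterator: the consumer pulls each token only when it is ready for the next one, so a slow client naturally throttles the producer. A hedged sketch of the idea (names are illustrative, not the runtime's API):

```typescript
// Illustrative only: yields tokens one at a time. In a real
// deployment the runtime streams model output over HTTP/SSE.
async function* tokenStream(tokens: string[]) {
  for (const t of tokens) {
    yield t // suspends until the consumer asks for the next token
  }
}

async function collect(): Promise<string> {
  let text = ""
  // `for await` pulls tokens at the consumer's pace (backpressure)
  for await (const token of tokenStream(["Hel", "lo", "!"])) {
    text += token
  }
  return text
}

collect().then((s) => console.log(s)) // "Hello!"
```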