Your agent works.
The API shouldn't take two weeks.

Auth, streaming, validation, rate limiting, CORS, monitoring — that's weeks of infra work. Or one serve() call with Reminix.

Define it. Deploy it.

A schema and a handler. Use whatever LLM or framework you want — Reminix never touches your agent's internals.

  • Production REST API with auth and validation
  • SSE streaming with backpressure
  • Schema validation on input and output
  • Request tracing, error tracking, latency metrics
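To make the streaming bullet concrete, here is a minimal sketch of consuming an SSE response client-side by parsing `data:` lines. The wire format shown (plain-text chunks, a `[DONE]` sentinel) is an assumption for illustration, not Reminix's documented protocol.

```typescript
// Minimal SSE line parser: a sketch of how a streamed agent
// response could be consumed. The "data: <chunk>" lines and the
// "[DONE]" sentinel are assumptions, not a documented wire format.
function parseSseChunks(raw: string): string[] {
  const chunks: string[] = []
  for (const line of raw.split("\n")) {
    if (!line.startsWith("data:")) continue // skip blanks and comments
    const payload = line.slice("data:".length).trim()
    if (payload === "[DONE]") break // end-of-stream sentinel
    chunks.push(payload)
  }
  return chunks
}
```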
agents/support-bot.ts
import OpenAI from "openai"
import { agent } from "@reminix/runtime"

const openai = new OpenAI()

export const supportBot = agent("support-bot", {
  type: "chat",
  description: "Customer support assistant",
  handler: async (input) => {
    const response = await openai.chat.completions.create({
      model: "gpt-4o",
      messages: input.messages,
    })
    // content can be null (e.g. refusals), so fall back to an empty string
    return response.choices[0].message.content ?? ""
  },
})

One deploy. Four endpoints.

Reminix generates production endpoints from your agent definition.

POST /v1/agents/{name}/invoke
POST /v1/agents/{name}/chat
POST /v1/agents/{name}/workflow
GET  /v1/agents/{name}
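As a rough sketch, calling the invoke endpoint directly with `fetch` might look like this. The host `api.example.com`, the Bearer auth header, and the `{ input }` body shape are assumptions; use the values from your Reminix dashboard.

```typescript
// Sketch of calling the generated invoke endpoint with fetch.
// BASE_URL is a placeholder host, not the real Reminix API origin.
const BASE_URL = "https://api.example.com"

// Builds the invoke path shown in the endpoint list above
function invokeUrl(agentName: string): string {
  return `${BASE_URL}/v1/agents/${agentName}/invoke`
}

// Assumed request shape: { input } JSON body with Bearer auth
async function invokeAgent(agentName: string, apiKey: string, input: unknown) {
  const res = await fetch(invokeUrl(agentName), {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ input }),
  })
  if (!res.ok) throw new Error(`invoke failed: ${res.status}`)
  return res.json()
}
```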

Or call it from your terminal.

Invoke agents, stream responses, and run multi-turn conversations directly from the CLI. Pipe output into scripts, CI/CD pipelines, or other AI agents.

Terminal
# Invoke an agent
$ reminix agent invoke support-bot \
    --prompt "How do I reset my password?"

# Multi-turn chat
$ reminix agent chat support-bot \
    --message "Hello" --stream

Simple tasks. Conversations. Workflows.

Not every agent is a chatbot. Reminix supports all three patterns out of the box.

Invoke

Send input, get output. Supports streaming.

Data processing, extraction, analysis

Conversations

Multi-turn with managed sessions.

Support bots, assistants, research

Workflows

Stateful processes that pause and resume.

Pipelines, approvals, orchestration
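As a rough sketch, the three patterns map to different request shapes. Every field name below is illustrative, not Reminix's documented schema.

```typescript
// Hypothetical request bodies for the three patterns.
// Field names are illustrative assumptions only.

// Invoke: one-shot input, one output
const invokeRequest = {
  input: { text: "Extract the entities from this document." },
}

// Conversation: multi-turn against a server-managed session
const chatRequest = {
  session_id: "sess_123",
  message: "Hello",
  stream: true,
}

// Workflow: stateful process that can pause and resume
const workflowRequest = {
  workflow_id: "wf_456",
  action: "resume",
  payload: { approved: true },
}
```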

Connect to 20+ services.

Google Calendar, Slack, GitHub, Notion — Reminix handles the full OAuth flow. Your code just gets a valid token.

using-connections.ts
// Assumes a configured Reminix client and the googleapis SDK
import { google } from "googleapis"

// Get a fresh token — auto-refreshed
const { access_token } = await client.oauthConnections.getToken("google")

// Use Google's own SDK directly
const calendar = google.calendar({
  version: "v3",
  auth: access_token,
})

const events = await calendar.events.list({
  calendarId: "primary",
  timeMin: new Date().toISOString(),
})

Bring your own framework.

LangChain, Vercel AI SDK, OpenAI, Anthropic, Gemini, or custom code. Wrap it in a Reminix agent and deploy.

LangChain
Vercel AI SDK
OpenAI
Anthropic
Gemini

Your agent deserves a real API.

Deploy in five minutes. Start free — no credit card required.