## Overview

Reminix agents are standard HTTP servers. Deploy them anywhere you can run Python or Node.js applications.

Looking for the quickest path to production? Reminix Cloud handles infrastructure, scaling, and monitoring for you — just push your code and go. This guide covers self-hosting for teams who prefer to run on their own infrastructure.
## Deployment Options

Reminix offers two ways to deploy your agents:

| Feature | Self-Hosting | Reminix Cloud |
|---|---|---|
| Setup | Deploy anywhere | Push to GitHub |
| Infrastructure | You manage | Fully managed |
| Scaling | Manual | Automatic |
| Monitoring | Your own tools | Built-in dashboard |
| Cost | Infrastructure costs | Usage-based |
| Best for | Full control, compliance needs | Fast deployment, zero DevOps |
Both options use the same open source Reminix Runtime, so you can switch between them anytime. No vendor lock-in.
## Docker

Python:

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY pyproject.toml .
RUN pip install .
COPY main.py .
EXPOSE 8080
CMD ["python", "main.py"]
```

```bash
docker build -t my-agent .
docker run -p 8080:8080 -e OPENAI_API_KEY=sk-... my-agent
```
TypeScript:

```dockerfile
FROM node:20-slim
WORKDIR /app
COPY package*.json .
RUN npm install
COPY . .
RUN npm run build
EXPOSE 8080
CMD ["node", "dist/index.js"]
```

```bash
docker build -t my-agent .
docker run -p 8080:8080 -e OPENAI_API_KEY=sk-... my-agent
```
## Kubernetes

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-agent
spec:
  replicas: 2
  selector:
    matchLabels:
      app: my-agent
  template:
    metadata:
      labels:
        app: my-agent
    spec:
      containers:
        - name: my-agent
          image: my-agent:latest
          ports:
            - containerPort: 8080
          env:
            - name: PORT
              value: "8080"
            - name: OPENAI_API_KEY
              valueFrom:
                secretKeyRef:
                  name: agent-secrets
                  key: openai-api-key
          livenessProbe:
            httpGet:
              path: /health
              port: 8080
            initialDelaySeconds: 10
            periodSeconds: 30
          readinessProbe:
            httpGet:
              path: /health
              port: 8080
            initialDelaySeconds: 5
            periodSeconds: 10
---
apiVersion: v1
kind: Service
metadata:
  name: my-agent
spec:
  selector:
    app: my-agent
  ports:
    - port: 80
      targetPort: 8080
  type: LoadBalancer
```
Create the secret:

```bash
kubectl create secret generic agent-secrets \
  --from-literal=openai-api-key=sk-...
```
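Self-hosted scaling is manual by default (see the comparison table above), but on Kubernetes you can automate it with a HorizontalPodAutoscaler. A minimal sketch targeting the Deployment above; the replica range and 70% CPU target are illustrative assumptions, not Reminix defaults:

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: my-agent
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-agent
  minReplicas: 2
  maxReplicas: 10            # illustrative ceiling; tune for your traffic
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # assumed target, not a Reminix default
```

Note that the Deployment above doesn't declare CPU requests; add resources.requests.cpu to the container spec or the utilization metric has nothing to measure against.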
## Multiple Agents

You can serve multiple agents from a single deployment. This is useful for related agents that share dependencies.

### Docker with Multiple Agents
Python:

```python
# main.py
from reminix_runtime import agent, serve
import os

@agent
async def summarizer(text: str) -> str:
    """Summarize text."""
    return f"Summary: {text[:100]}..."

@agent
async def translator(text: str, target: str = "es") -> str:
    """Translate text."""
    return f"Translated to {target}: {text}"

if __name__ == "__main__":
    port = int(os.environ.get("PORT", 8080))
    serve(agents=[summarizer, translator], port=port)
```

```bash
docker run -p 8080:8080 my-multi-agent
```
TypeScript:

```typescript
// src/index.ts
import { agent, serve } from '@reminix/runtime';

const summarizer = agent('summarizer', {
  description: 'Summarize text',
  handler: async ({ text }) => `Summary: ${(text as string).slice(0, 100)}...`
});

const translator = agent('translator', {
  description: 'Translate text',
  handler: async ({ text, target }) => `Translated to ${target || 'es'}: ${text}`
});

const port = parseInt(process.env.PORT || '8080');
serve({ agents: [summarizer, translator], port });
```

```bash
docker run -p 8080:8080 my-multi-agent
```
### Kubernetes with Multiple Agents

The same deployment serves all agents. Call specific agents by name:

```bash
# Call the summarizer agent
curl -X POST http://my-agent/agents/summarizer/invoke \
  -H "Content-Type: application/json" \
  -d '{"text": "Long document..."}'

# Call the translator agent
curl -X POST http://my-agent/agents/translator/invoke \
  -H "Content-Type: application/json" \
  -d '{"text": "Hello", "target": "es"}'
```

Use the `/info` endpoint to discover available agents:

```bash
curl http://my-agent/info
# {"agents": [{"name": "summarizer", ...}, {"name": "translator", ...}]}
```
## Serverless

For serverless deployments, use `toHandler()` (TypeScript) or `to_asgi()` (Python) to get a handler compatible with edge/serverless platforms.

### TypeScript

```typescript
// Vercel Edge Function (app/api/agent/route.ts)
import { agent } from '@reminix/runtime';

const myAgent = agent('my-agent', {
  description: 'A serverless agent',
  handler: async ({ prompt }) => `Completed: ${prompt}`
});

// Export the handler
export const POST = myAgent.toHandler();
export const GET = myAgent.toHandler();
```
Compatible platforms:

- Vercel Edge Functions - export the handler directly
- Cloudflare Workers - `export default { fetch: agent.toHandler() }`
- Deno Deploy - `Deno.serve(agent.toHandler())`
- Bun - `Bun.serve({ fetch: agent.toHandler() })`
### Python

```python
# AWS Lambda with Mangum
from mangum import Mangum
from reminix_runtime import agent

@agent
async def my_agent(prompt: str) -> str:
    """A serverless agent."""
    return f"Completed: {prompt}"

# Lambda handler
handler = Mangum(my_agent.to_asgi())
```
Compatible platforms:

- AWS Lambda - use the Mangum adapter
- GCP Cloud Functions - use functions-framework with ASGI
- Any ASGI server - uvicorn, hypercorn, daphne (a minimal sketch follows this list)
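As a concrete example of the last option, here is a minimal sketch of serving the agent's ASGI app with uvicorn. It assumes `to_asgi()` returns a standard ASGI application, consistent with the Mangum example above, and that the agent is defined in a main.py module (a hypothetical layout for this sketch):

```python
# serve_asgi.py — minimal sketch; assumes to_asgi() returns a
# standard ASGI app, as the Mangum example above implies
import os

import uvicorn

from main import my_agent  # hypothetical module containing the agent

app = my_agent.to_asgi()

if __name__ == "__main__":
    # Respect the PORT convention used throughout this guide
    uvicorn.run(app, host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```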
## Railway

```bash
railway login
railway init
railway up
```

Set environment variables in the Railway dashboard.
## Render

- Connect your GitHub repository
- Set environment variables in the dashboard
- Render auto-deploys on push (a blueprint sketch follows this list)
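If you prefer configuration as code, Render also reads a render.yaml blueprint from the repository root. A minimal sketch, assuming the Dockerfile from the Docker section above; the field names follow Render's blueprint format, so verify them against Render's current docs:

```yaml
# render.yaml — minimal blueprint sketch (verify fields against Render's docs)
services:
  - type: web
    name: my-agent
    runtime: docker        # builds from the Dockerfile in the repo root
    envVars:
      - key: OPENAI_API_KEY
        sync: false        # prompts for the value in the dashboard
```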
## Fly.io

```bash
fly launch
fly secrets set OPENAI_API_KEY=sk-...
fly deploy
```
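`fly launch` generates a fly.toml for you; the main thing to verify is that the internal port matches the port your agent listens on. A minimal sketch (the app name is illustrative):

```toml
# fly.toml — minimal sketch; `fly launch` generates a fuller version
app = "my-agent"           # illustrative name

[http_service]
  internal_port = 8080     # must match the PORT the agent listens on
  force_https = true
```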
## AWS

App Runner (easiest):

- Create an App Runner service
- Connect to your container registry or GitHub
- Set environment variables in the service configuration
ECS/Fargate (a minimal task definition sketch follows these steps):

- Push the image to ECR
- Create a task definition with environment variables
- Create an ECS service
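A minimal Fargate task definition sketch. The account ID, region, and CPU/memory sizes are illustrative placeholders, and in production the API key would come from Secrets Manager or SSM rather than a plain environment entry:

```json
{
  "family": "my-agent",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "256",
  "memory": "512",
  "containerDefinitions": [
    {
      "name": "my-agent",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-agent:latest",
      "portMappings": [{ "containerPort": 8080 }],
      "environment": [{ "name": "PORT", "value": "8080" }],
      "healthCheck": {
        "command": ["CMD-SHELL", "curl -f http://localhost:8080/health || exit 1"]
      }
    }
  ]
}
```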
Lambda (container image):

- Push the image to ECR
- Create a Lambda function from the container image
- Set environment variables in the function configuration
## Google Cloud

Cloud Run (recommended):

```bash
gcloud run deploy my-agent \
  --image gcr.io/PROJECT/my-agent \
  --platform managed \
  --set-env-vars OPENAI_API_KEY=sk-...
```

GKE: Use the Kubernetes configuration above.
## Azure

Container Apps:

```bash
az containerapp create \
  --name my-agent \
  --resource-group mygroup \
  --image my-agent:latest \
  --target-port 8080 \
  --env-vars OPENAI_API_KEY=sk-...
```
## Environment Variables

Configure your agent using environment variables:

| Variable | Description |
|---|---|
| `PORT` | Server port (default: 8080) |
| `OPENAI_API_KEY` | OpenAI API key |
| `ANTHROPIC_API_KEY` | Anthropic API key |
| `DATABASE_URL` | Database connection string |
Python:

```python
import os

port = int(os.environ.get("PORT", 8080))
api_key = os.environ.get("OPENAI_API_KEY")
```

TypeScript:

```typescript
const port = parseInt(process.env.PORT || '8080');
const apiKey = process.env.OPENAI_API_KEY;
```
## Health Checks

The runtime exposes `/health` automatically. Configure your platform to use it:

```bash
curl http://localhost:8080/health
# {"status": "ok"}
```

Most platforms support health check configuration:

- Kubernetes: `livenessProbe` and `readinessProbe`
- Docker Compose: `healthcheck` (see the sketch after this list)
- AWS ECS: health check in the task definition
- Cloud Run: automatic via `/health`
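For Docker Compose, a minimal healthcheck sketch that polls the `/health` endpoint. The service layout mirrors the Docker section above; the intervals are illustrative, and the test assumes curl is available in the image (slim base images may need it installed):

```yaml
# docker-compose.yml — minimal sketch; intervals are illustrative
services:
  my-agent:
    build: .
    ports:
      - "8080:8080"
    environment:
      - PORT=8080
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    healthcheck:
      # Assumes curl exists in the image; slim bases may need it installed
      test: ["CMD", "curl", "-f", "http://localhost:8080/health"]
      interval: 30s
      timeout: 5s
      retries: 3
```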
## Connecting to Self-Hosted Agents

Point the SDK to your self-hosted agent:

Python:

```python
from reminix import Reminix

client = Reminix(base_url="https://my-agent.example.com")
response = client.agents.invoke("my-agent", prompt="Hello!")
print(response["content"])
```

TypeScript:

```typescript
import Reminix from '@reminix/sdk';

const client = new Reminix({ baseURL: 'https://my-agent.example.com' });
const response = await client.agents.invoke('my-agent', {
  prompt: 'Hello!'
});
console.log(response.content);
```
## Next Steps