
Requirements

Before deploying, make sure your project meets these requirements:
  • reminix-runtime must be listed in your dependencies
  • Your code must call serve() to start the server
  • You must have a recognized entrypoint file

Entrypoints

Reminix looks for these files in order and uses the first one found:
  1. server.py
  2. agent.py
  3. main.py

Project Structure

my-agent/
├── server.py         # Entrypoint — calls serve()
├── requirements.txt  # or pyproject.toml
└── ...

Minimal Example

# server.py
from reminix_runtime import agent, serve

@agent(name="my-agent", type="prompt")
async def my_agent(prompt: str) -> str:
    return f"You said: {prompt}"

serve(agents=[my_agent])

Package Managers

requirements.txt:

reminix-runtime
openai

pyproject.toml (Poetry):

[tool.poetry.dependencies]
python = "^3.11"
reminix-runtime = "*"
openai = "*"

pyproject.toml (PEP 621):

[project]
dependencies = [
    "reminix-runtime",
    "openai",
]

What Happens When You Deploy

1. Detect: Reminix detects your Python project and locates the entrypoint file (server.py, agent.py, or main.py).
2. Build: A Docker image is built with your dependencies installed from requirements.txt or pyproject.toml.
3. Deploy: The container is deployed and your server starts on port 8080.
4. Discover: Reminix automatically discovers all agents and tools defined in your server.
5. Live: Your agents are accessible via the REST API at POST /v1/agents/{name}/invoke.
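Once a deployment is live, an invocation can be sketched with Python's standard library. The base URL, the payload field name (`prompt`), and the absence of auth headers are all assumptions here; only the `/v1/agents/{name}/invoke` route comes from this page, so check the API reference for the exact request schema:

```python
import json
import urllib.request

BASE_URL = "https://your-deployment.example"  # assumption: your Reminix deployment URL
AGENT_NAME = "my-agent"

def build_invoke_request(base_url: str, name: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the /v1/agents/{name}/invoke route."""
    url = f"{base_url}/v1/agents/{name}/invoke"
    # Payload shape is an assumption; consult the API reference for the real schema.
    body = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_invoke_request(BASE_URL, AGENT_NAME, "hello")
# urllib.request.urlopen(req) would send it once the deployment is live.
```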

Environment Variables

Set secrets and configuration values in the Reminix Dashboard. They are available to your agent at runtime as standard environment variables.
  • PORT is set automatically (defaults to 8080)
  • Add any custom variables your agent needs (API keys, database URLs, etc.)
Your server must listen on 0.0.0.0 (the default for serve()). Binding to localhost or 127.0.0.1 will make your agent unreachable inside the container.
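A handler can read these values with the standard library. PORT comes from the platform as described above; OPENAI_API_KEY is just a hypothetical custom variable standing in for whatever your agent needs:

```python
import os

# PORT is set by Reminix; fall back to 8080 when running outside the platform.
port = int(os.environ.get("PORT", "8080"))

# Hypothetical custom secret configured in the Reminix Dashboard.
api_key = os.environ.get("OPENAI_API_KEY", "")
```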

With Framework Packages

If using framework integrations, add them to your dependencies alongside reminix-runtime.
requirements.txt:

reminix-runtime
reminix-langchain
langchain-openai
Then use the framework package to create agents without writing a handler manually:
from reminix_runtime import serve
from reminix_langchain import LangChainChatAgent
from langchain_openai import ChatOpenAI

bot = LangChainChatAgent(
    ChatOpenAI(model="gpt-4o"),
    name="gpt-agent",
    instructions="You are a helpful assistant.",
)

serve(agents=[bot])
Framework packages like reminix-langchain handle prompt formatting, API calls, and response parsing automatically. You just configure the model and instructions.

Next steps

Deploy from GitHub

Connect your repo and ship on every push.

Configuration & Secrets

How to set environment variables and read them at runtime.

CLI Reference

reminix deploy and other command-line tools.

Troubleshooting

What to check when a deploy or invocation fails.