Framework Adapters

Wrap popular AI frameworks for use with the Reminix runtime

Reminix provides adapters that let you wrap agents from popular AI frameworks and serve them using the Reminix runtime. This allows you to:

  • Reuse existing agents — No need to rewrite your LangChain or OpenAI-based agents
  • Unified API — All agents expose the same /invoke and /chat endpoints
  • Streaming support — Built-in streaming for all adapters
  • Dashboard integration — Monitor all your agents in one place

Installation

Adapters are included in the reminix package. Install it with the optional extras for the frameworks you use:

# Core runtime (required)
pip install reminix[runtime]

# Individual adapter dependencies
pip install reminix[openai]      # For OpenAI
pip install reminix[anthropic]   # For Anthropic
pip install reminix[langchain]   # For LangChain
pip install reminix[langgraph]   # For LangGraph
pip install reminix[llamaindex]  # For LlamaIndex

# Or install all adapters at once
pip install reminix[adapters]

# Or install everything
pip install reminix[all]

Available Adapters

Framework       Adapter Functions
LangChain       from_langchain_agent, from_agent_executor, from_chat_model
LangGraph       from_compiled_graph
LlamaIndex      from_query_engine, from_chat_engine, from_agent
OpenAI SDK      from_openai
Anthropic SDK   from_anthropic
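As a sketch of how another adapter from the table slots into the same runtime, here is what wrapping a LangChain chat model might look like. The module path reminix.adapters.langchain and the keyword arguments of from_chat_model are assumptions modeled on the OpenAI example in the Quick Example section; check the adapter reference for the exact signature.

from langchain_openai import ChatOpenAI
from reminix.adapters.langchain import from_chat_model  # assumed module path
from reminix.runtime import serve

# Wrap an existing LangChain chat model as a Reminix agent.
# The name keyword mirrors the OpenAI example; exact parameters may differ.
model = ChatOpenAI(model="gpt-4o")
agent = from_chat_model(model, name="langchain-agent")

serve(agent)

The other adapter functions (from_agent_executor, from_compiled_graph, from_query_engine, and so on) follow the same pattern: wrap the framework object, then pass the result to serve.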

Quick Example

from openai import AsyncOpenAI
from reminix.adapters.openai import from_openai
from reminix.runtime import serve

client = AsyncOpenAI()
agent = from_openai(client, name="gpt4-agent", model="gpt-4o")

serve(agent)

That's it! Your OpenAI-powered agent now has:

  • POST /agent/gpt4-agent/invoke — Single-turn requests
  • POST /agent/gpt4-agent/chat — Multi-turn conversations
  • Streaming support with stream: true
  • Health checks and discovery endpoints
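A hedged example of what calling the served agent might look like: the host, port, and request payload shape below are assumptions (the runtime's actual request schema may differ), but the endpoint path and the stream flag come from the list above.

# Hypothetical request -- assumes the runtime listens on localhost:8000
# and accepts an "input" field; the exact payload schema may differ.
curl -X POST http://localhost:8000/agent/gpt4-agent/invoke \
  -H "Content-Type: application/json" \
  -d '{"input": "Hello!", "stream": true}'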
