OpenAI SDK

Wrap OpenAI clients for use with the Reminix runtime

The OpenAI adapter wraps an AsyncOpenAI client and exposes it as a Reminix agent that can be served by the runtime.

Installation

pip install reminix[runtime,openai]

Usage

from openai import AsyncOpenAI
from reminix.adapters.openai import from_openai
from reminix.runtime import serve

client = AsyncOpenAI()
agent = from_openai(
    client,
    name="gpt4-agent",
    model="gpt-4o",
    system="You are a helpful assistant.",
)

serve(agent)

Options

Parameter   Type          Default    Description
client      AsyncOpenAI   required   The OpenAI client instance
name        str           required   Name for the Reminix agent
model       str           "gpt-4o"   The model to use
system      str | None    None       Optional system prompt
metadata    dict          None       Optional metadata for the agent
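
For example, a minimal sketch that sets every option from the table above; the agent name, system prompt, and metadata keys are illustrative only:

from openai import AsyncOpenAI
from reminix.adapters.openai import from_openai

client = AsyncOpenAI()

# Uses all parameters from the options table; metadata values are arbitrary examples.
agent = from_openai(
    client,
    name="support-agent",
    model="gpt-4o",
    system="You answer questions about our product.",
    metadata={"team": "support", "version": "1.0"},
)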

Handler Mapping

Reminix Handler        OpenAI Method
@agent.invoke          client.chat.completions.create(stream=False)
@agent.invoke_stream   client.chat.completions.create(stream=True)
@agent.chat            client.chat.completions.create(stream=False)
@agent.chat_stream     client.chat.completions.create(stream=True)
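
Conceptually, the chat handlers forward the message history to the OpenAI client, prepending the system prompt when one is configured. A simplified sketch of roughly what each handler does with the underlying client (not the adapter's actual implementation) might look like:

from openai import AsyncOpenAI

# Sketch only: assumes messages are plain {"role": ..., "content": ...} dicts.
async def chat(client: AsyncOpenAI, model: str, system: str | None, messages: list[dict]) -> str:
    if system:
        messages = [{"role": "system", "content": system}] + messages
    # Non-streaming call, as used by @agent.invoke and @agent.chat.
    response = await client.chat.completions.create(
        model=model,
        messages=messages,
        stream=False,
    )
    return response.choices[0].message.content

async def chat_stream(client: AsyncOpenAI, model: str, system: str | None, messages: list[dict]):
    if system:
        messages = [{"role": "system", "content": system}] + messages
    # Streaming call, as used by @agent.invoke_stream and @agent.chat_stream.
    stream = await client.chat.completions.create(
        model=model,
        messages=messages,
        stream=True,
    )
    async for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:
            yield delta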

Example: Multi-turn Chat

from openai import AsyncOpenAI
from reminix.adapters.openai import from_openai
from reminix.runtime import serve

client = AsyncOpenAI()
agent = from_openai(
    client,
    name="conversational-agent",
    model="gpt-4o",
    system="You are a friendly conversational AI. Remember context from previous messages.",
)

if __name__ == "__main__":
    serve(agent, port=8080)

Test with:

curl -X POST http://localhost:8080/agent/conversational-agent/chat \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "My name is Alice."},
      {"role": "assistant", "content": "Nice to meet you, Alice!"},
      {"role": "user", "content": "What is my name?"}
    ]
  }'
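
The same request can be sent from Python; this sketch assumes the requests library is installed (any HTTP client works). Because the full message history is included in each call, the agent should answer with the name given earlier in the conversation:

import requests

# Same endpoint and payload as the curl example above.
resp = requests.post(
    "http://localhost:8080/agent/conversational-agent/chat",
    json={
        "messages": [
            {"role": "user", "content": "My name is Alice."},
            {"role": "assistant", "content": "Nice to meet you, Alice!"},
            {"role": "user", "content": "What is my name?"},
        ]
    },
)
print(resp.json())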
