# OpenAI SDK

The OpenAI adapter wraps an OpenAI client for use with the Reminix runtime.
## Installation

```bash
npm install @reminix/runtime @reminix/adapters openai
# or
pnpm add @reminix/runtime @reminix/adapters openai
```

## Usage
```typescript
import OpenAI from 'openai';
import { fromOpenAI } from '@reminix/adapters/openai';
import { serve } from '@reminix/runtime';

const client = new OpenAI();

const agent = fromOpenAI(client, {
  name: 'gpt4-agent',
  model: 'gpt-4o',
  system: 'You are a helpful assistant.',
});

serve(agent);
```

## Options
| Parameter | Type | Default | Description |
|---|---|---|---|
| `name` | `string` | required | Name for the Reminix agent |
| `model` | `string` | `"gpt-4o"` | The model to use |
| `system` | `string` | `undefined` | Optional system prompt |
| `metadata` | `Record<string, unknown>` | `undefined` | Optional metadata for the agent |
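Putting the table together, a sketch of a fully-specified configuration. Only `name` is required; the `metadata` keys used here (`team`, `env`) are illustrative values, not keys the adapter expects:

```typescript
import OpenAI from 'openai';
import { fromOpenAI } from '@reminix/adapters/openai';
import { serve } from '@reminix/runtime';

// Every option from the table above. The metadata values are
// arbitrary examples -- metadata is free-form key/value data.
const agent = fromOpenAI(new OpenAI(), {
  name: 'tagged-agent',
  model: 'gpt-4o',
  system: 'You are a helpful assistant.',
  metadata: { team: 'platform', env: 'staging' },
});

serve(agent);
```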
## Handler Mapping
| Reminix Handler | OpenAI Method |
|---|---|
| `onInvoke` | `client.chat.completions.create({ stream: false })` |
| `onInvokeStream` | `client.chat.completions.create({ stream: true })` |
| `onChat` | `client.chat.completions.create({ stream: false })` |
| `onChatStream` | `client.chat.completions.create({ stream: true })` |
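The stream handlers above suggest chat responses can also be consumed incrementally. A hedged sketch, assuming the runtime exposes a `/chat/stream` route alongside `/chat` (the path is an assumption; verify it against your Reminix runtime version):

```bash
# Hypothetical streaming request. The /chat/stream path is an
# assumption, not confirmed by this page; -N disables curl's
# output buffering so chunks print as they arrive.
curl -N -X POST http://localhost:8080/agent/gpt4-agent/chat/stream \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello!"}]}'
```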
## Example: Multi-turn Chat
```typescript
import OpenAI from 'openai';
import { fromOpenAI } from '@reminix/adapters/openai';
import { serve } from '@reminix/runtime';

const client = new OpenAI();

const agent = fromOpenAI(client, {
  name: 'conversational-agent',
  model: 'gpt-4o',
  system: 'You are a friendly conversational AI. Remember context from previous messages.',
});

serve(agent, { port: 8080 });
```

Test with:
```bash
curl -X POST http://localhost:8080/agent/conversational-agent/chat \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "My name is Alice."},
      {"role": "assistant", "content": "Nice to meet you, Alice!"},
      {"role": "user", "content": "What is my name?"}
    ]
  }'
```