Overview

@reminix/openai provides a simple wrapper for the OpenAI API. This is a model-only adapter for basic chat completions without tool calling.
For agents with tool calling, use LangChain, LangGraph, or the Vercel AI SDK instead.

Installation

npm install @reminix/openai
This will also install @reminix/runtime as a dependency.

Quick Start

Wrap an OpenAI client and serve it:
import OpenAI from 'openai';
import { serveAgent } from '@reminix/openai';

const client = new OpenAI();
serveAgent(client, { name: 'chat-agent', model: 'gpt-4o', port: 8080 });

Configuration

Agent options are passed as the second argument to wrapAgent or serveAgent:
const agent = wrapAgent(client, {
  name: 'my-agent', // Agent name, used when invoking it
  model: 'gpt-4o'   // Model to use for completions
});
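
Because the adapter wraps a client you construct, client-level settings such as the API key or a custom base URL are configured on the OpenAI client itself before wrapping. A minimal sketch using the OpenAI SDK's standard constructor options:
import OpenAI from 'openai';
import { wrapAgent } from '@reminix/openai';

// Client-level settings (apiKey, baseURL, etc.) are set here,
// e.g. to target an OpenAI-compatible endpoint.
const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: 'https://api.openai.com/v1'
});

const agent = wrapAgent(client, { name: 'my-agent', model: 'gpt-4o' });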

Multiple Agents

For multi-agent projects, use wrapAgent + serve instead of serveAgent:
import { wrapAgent } from '@reminix/openai';
import { serve } from '@reminix/runtime';

const gpt4 = wrapAgent(client, { name: 'gpt4-agent', model: 'gpt-4o' });
const gpt35 = wrapAgent(client, { name: 'gpt35-agent', model: 'gpt-3.5-turbo' });

serve({ agents: [gpt4, gpt35], port: 8080 });
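
Each agent is then invoked by its registered name through the Reminix SDK (see Usage below), for example:
import Reminix from '@reminix/sdk';

const reminix = new Reminix();

// Route requests to whichever model fits the task.
const fromGpt4 = await reminix.agents.invoke('gpt4-agent', { prompt: 'Say hello.' });
const fromGpt35 = await reminix.agents.invoke('gpt35-agent', { prompt: 'Say hello.' });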
See Multiple Agents for detailed guidance on multi-agent projects.

Usage

With Prompt

Single request/response with a prompt, using the Reminix SDK:
import Reminix from '@reminix/sdk';

const client = new Reminix();  // Reminix SDK client for calling deployed agents

const response = await client.agents.invoke('chat-agent', {
  prompt: 'Explain quantum computing in one sentence'
});

console.log(response.output);

With Messages

For conversations with message history:
const response = await client.agents.invoke('chat-agent', {
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'What is TypeScript?' },
    { role: 'assistant', content: 'TypeScript is a typed superset of JavaScript.' },
    { role: 'user', content: 'What are its main benefits?' }
  ]
});

console.log(response.output);
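
To continue a conversation over multiple turns, append the assistant's reply and the next user message to the history before invoking again. A minimal sketch, assuming response.output contains the assistant's text:
const history = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'What is TypeScript?' }
];

const first = await client.agents.invoke('chat-agent', { messages: history });

// Feed the reply back into the history, then ask a follow-up question.
history.push({ role: 'assistant', content: first.output });
history.push({ role: 'user', content: 'Show me a small example.' });

const second = await client.agents.invoke('chat-agent', { messages: history });
console.log(second.output);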

Streaming

For real-time streaming responses:
const stream = await client.agents.invoke('chat-agent', {
  prompt: 'Tell me a story',
  stream: true
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
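
If you also need the complete response after streaming, accumulate the chunks as they arrive (assuming each chunk is a string, as in the loop above):
let fullText = '';

const stream = await client.agents.invoke('chat-agent', {
  prompt: 'Tell me a story',
  stream: true
});

for await (const chunk of stream) {
  process.stdout.write(chunk);  // display incrementally
  fullText += chunk;            // keep the full text for later use
}

console.log(`\nReceived ${fullText.length} characters`);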

When to Use This Adapter

Use @reminix/openai when:
  • You need simple chat completions
  • No tool calling required
  • Direct model access is sufficient
Use LangChain/LangGraph/Vercel AI instead when:
  • You need tool calling / function calling
  • You need agents with memory
  • You need complex workflows

Deployment

See the Deployment guide for deploying to Reminix, or Self-Hosting for deploying on your own infrastructure. Set your OpenAI API key in Project Settings → Secrets.
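
The OpenAI SDK reads OPENAI_API_KEY from the environment by default; if the key is not available under that name, pass it explicitly when constructing the client. A sketch, assuming your secrets are exposed to the service as environment variables:
import OpenAI from 'openai';
import { serveAgent } from '@reminix/openai';

// Read the key from an environment variable populated by your secrets store.
const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

serveAgent(client, { name: 'chat-agent', model: 'gpt-4o', port: 8080 });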

Next Steps