## Overview
`@reminix/vercel-ai` lets you deploy agents built with the Vercel AI SDK to Reminix. It supports both:

- **ToolLoopAgent** - full agents with tools and automatic tool-loop handling
- **Model** - simple completions via `generateText`/`streamText`, without tools
## Installation

```bash
npm install @reminix/vercel-ai ai
```

This also installs `@reminix/runtime` as a dependency.
## Quick Start with Model

For simple completions, pass the model directly:

```typescript
import { openai } from '@ai-sdk/openai';
import { serveAgent } from '@reminix/vercel-ai';

const model = openai('gpt-4o');

serveAgent(model, { name: 'chat-agent', port: 8080 });
```
## Quick Start with ToolLoopAgent

For agents with tools, use `ToolLoopAgent`:

```typescript
import { ToolLoopAgent, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';
import { serveAgent } from '@reminix/vercel-ai';

const weatherTool = tool({
  description: 'Get the current weather for a city',
  inputSchema: z.object({
    city: z.string()
  }),
  execute: async ({ city }) => {
    return { city, temp: 72, condition: 'sunny' };
  }
});

const agent = new ToolLoopAgent({
  model: openai('gpt-4o'),
  instructions: 'You help users check the weather.',
  tools: { getWeather: weatherTool }
});

serveAgent(agent, { name: 'weather-agent', port: 8080 });
```
## ToolLoopAgent with Multiple Tools

```typescript
import { ToolLoopAgent, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';
import { wrapAgent } from '@reminix/vercel-ai';

// `catalog` stands in for your own product-catalog client.
const searchProducts = tool({
  description: 'Search the product catalog',
  inputSchema: z.object({
    query: z.string().describe('Search query'),
    category: z.string().optional().describe('Category filter')
  }),
  execute: async ({ query, category }) => {
    return await catalog.search(query, { category });
  }
});

const getProductDetails = tool({
  description: 'Get details for a specific product',
  inputSchema: z.object({
    productId: z.string()
  }),
  execute: async ({ productId }) => {
    return await catalog.get(productId);
  }
});

const agent = new ToolLoopAgent({
  model: openai('gpt-4o'),
  instructions: 'You help users find and learn about products.',
  tools: { searchProducts, getProductDetails }
});

const reminixAgent = wrapAgent(agent, { name: 'shopping-assistant' });
```
## With Different Providers

Use any AI provider supported by the Vercel AI SDK:

### Anthropic

```typescript
import { ToolLoopAgent } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { wrapAgent } from '@reminix/vercel-ai';

const agent = new ToolLoopAgent({
  model: anthropic('claude-sonnet-4-20250514'),
  instructions: 'You are Claude.',
  tools: { ... }
});

const reminixAgent = wrapAgent(agent, { name: 'claude-agent' });
```
### Google

```typescript
import { ToolLoopAgent } from 'ai';
import { google } from '@ai-sdk/google';
import { wrapAgent } from '@reminix/vercel-ai';

const agent = new ToolLoopAgent({
  model: google('gemini-1.5-pro'),
  instructions: 'You are Gemini.',
  tools: { ... }
});

const reminixAgent = wrapAgent(agent, { name: 'gemini-agent' });
```
## When to Use Each Option

| Option | Use Case |
| --- | --- |
| ToolLoopAgent | Agents that need tools, multi-step reasoning, and automatic tool-loop handling |
| Model | Simple completions, or providers without dedicated adapters (Google, Mistral, etc.) |

For simple completions with OpenAI or Anthropic, you can also use `@reminix/openai` or `@reminix/anthropic` directly.
## Multiple Agents

For multi-agent projects, use `wrapAgent` + `serve` instead of `serveAgent`:

```typescript
import { wrapAgent } from '@reminix/vercel-ai';
import { serve } from '@reminix/runtime';

const weather = wrapAgent(weatherAgent, { name: 'weather-agent' });
const shopping = wrapAgent(shoppingAgent, { name: 'shopping-agent' });

serve({ agents: [weather, shopping], port: 8080 });
```
## Usage

Once deployed, call your agent using `invoke`. See Agents for detailed guidance.

### With Prompt

For task-oriented operations:

```typescript
import Reminix from '@reminix/sdk';

const client = new Reminix();

const response = await client.agents.invoke('weather-agent', {
  prompt: 'What is the weather in Tokyo?'
});

console.log(response.output);
```
### With Messages

For conversations with message history:

```typescript
const response = await client.agents.invoke('weather-agent', {
  messages: [
    { role: 'user', content: 'What is the weather in Tokyo?' },
    { role: 'assistant', content: 'It is 72°F and sunny in Tokyo.' },
    { role: 'user', content: 'What about New York?' }
  ]
});

console.log(response.output);
```
### Streaming

For real-time streaming responses:

```typescript
const stream = await client.agents.invoke('weather-agent', {
  prompt: 'What is the weather in Tokyo?',
  stream: true
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
```
## Deployment

See the Deployment guide for deploying to Reminix, or Self-Hosting for deploying on your own infrastructure.

Set your API keys in Project Settings → Secrets.
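When running an agent locally during development, the Vercel AI SDK provider packages read API keys from their standard environment variables, so a sketch of a local setup (key values are placeholders) might look like:

```shell
# Standard env vars read by the AI SDK provider packages:
export OPENAI_API_KEY="sk-..."             # @ai-sdk/openai
export ANTHROPIC_API_KEY="sk-ant-..."      # @ai-sdk/anthropic
export GOOGLE_GENERATIVE_AI_API_KEY="..."  # @ai-sdk/google
```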
## Next Steps