
Installation

npm install @reminix/vercel-ai ai
Requires ai version 6.0.0 or later as a peer dependency.

Chat Agent

Use VercelAIChatAgent for conversational agents with streaming support. You can pass either a LanguageModel or a ToolLoopAgent as the first argument.
import { openai } from "@ai-sdk/openai"
import { VercelAIChatAgent } from "@reminix/vercel-ai"
import { serve } from "@reminix/runtime"

const chatbot = new VercelAIChatAgent(openai("gpt-4o"), {
  name: "chatbot",
  instructions: "You are a helpful assistant.",
})

serve({ agents: [chatbot] })

Using Tools

Create a ToolLoopAgent with tools and pass it instead of a plain model. The agent will automatically call tools in a loop until it produces a final response.
import { openai } from "@ai-sdk/openai"
import { ToolLoopAgent, tool } from "ai"
import { z } from "zod"
import { VercelAIChatAgent } from "@reminix/vercel-ai"
import { serve } from "@reminix/runtime"

const getWeather = tool({
  description: "Get the current weather for a city",
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ city }) => {
    const data: Record<string, string> = {
      paris: "Sunny, 22°C",
      london: "Cloudy, 15°C",
      tokyo: "Rainy, 18°C",
    }
    return data[city.toLowerCase()] ?? `Weather data not available for ${city}`
  },
})

const toolAgent = new ToolLoopAgent({
  model: openai("gpt-4o-mini"),
  tools: { getWeather },
})

const chatbot = new VercelAIChatAgent(toolAgent, {
  name: "weather-assistant",
})

serve({ agents: [chatbot] })

ToolLoopAgent works with all three agent types: VercelAIChatAgent, VercelAITaskAgent, and VercelAIThreadAgent.
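
The execute handler above is an ordinary async function, so the tool's logic can be exercised without the model loop. A standalone copy of the lookup logic, extracted here purely for illustration:

```typescript
// Same lookup behavior as the getWeather tool's execute handler above,
// pulled out so it can run without the AI SDK or a model call.
async function lookupWeather(city: string): Promise<string> {
  const data: Record<string, string> = {
    paris: "Sunny, 22°C",
    london: "Cloudy, 15°C",
    tokyo: "Rainy, 18°C",
  }
  // Case-insensitive lookup with a readable fallback message.
  return data[city.toLowerCase()] ?? `Weather data not available for ${city}`
}
```

Keeping tool bodies as plain functions like this makes them easy to unit test before wiring them into a ToolLoopAgent.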

Streaming

Chat agents support streaming out of the box:
import Reminix from "@reminix/sdk"

const client = new Reminix()

const stream = await client.agents.chat("chatbot", {
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
})

for await (const event of stream) {
  if (event.type === "text_delta") {
    process.stdout.write(event.delta)
  }
}
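
To reassemble the streamed deltas into the full response text, the text_delta events can be accumulated. A small helper, assuming only the event shape shown above; other event types are simply skipped:

```typescript
// Minimal event shape based on the stream loop above; real streams may
// carry additional event types and fields.
type StreamEvent = { type: string; delta?: string }

// Concatenates all text_delta payloads from an async iterable of events
// into the final response string.
async function collectText(stream: AsyncIterable<StreamEvent>): Promise<string> {
  let text = ""
  for await (const event of stream) {
    if (event.type === "text_delta" && event.delta !== undefined) {
      text += event.delta
    }
  }
  return text
}
```

The same loop works on the stream returned by client.agents.chat, since any async iterable of events fits the signature.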

Task Agent

Use VercelAITaskAgent for structured output with a defined schema.
import { openai } from "@ai-sdk/openai"
import { VercelAITaskAgent } from "@reminix/vercel-ai"
import { serve } from "@reminix/runtime"

const analyzer = new VercelAITaskAgent(openai("gpt-4o"), {
  name: "analyzer",
  instructions: "Analyze the sentiment of the given text.",
  outputSchema: {
    type: "object",
    properties: {
      sentiment: { type: "string", enum: ["positive", "negative", "neutral"] },
      confidence: { type: "number" },
    },
    required: ["sentiment", "confidence"],
  },
})

serve({ agents: [analyzer] })
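
The outputSchema above implies a result shape for the analyzer. The exact result object the runtime returns isn't shown here, but a type guard matching that schema can be useful when consuming the structured output in TypeScript; the type name below is ours, not part of the SDK:

```typescript
// Shape implied by the analyzer's outputSchema above (illustrative name).
type SentimentResult = {
  sentiment: "positive" | "negative" | "neutral"
  confidence: number
}

// Narrows an unknown value to SentimentResult, mirroring the schema's
// enum constraint and required fields.
function isSentimentResult(value: unknown): value is SentimentResult {
  if (typeof value !== "object" || value === null) return false
  const v = value as Record<string, unknown>
  return (
    (v.sentiment === "positive" ||
      v.sentiment === "negative" ||
      v.sentiment === "neutral") &&
    typeof v.confidence === "number"
  )
}
```

Validating the result at the boundary keeps downstream code typed even though the model output arrives as untyped JSON.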

Thread Agent

Use VercelAIThreadAgent for agents that manage full message history.
import { openai } from "@ai-sdk/openai"
import { VercelAIThreadAgent } from "@reminix/vercel-ai"
import { serve } from "@reminix/runtime"

const assistant = new VercelAIThreadAgent(openai("gpt-4o"), {
  name: "assistant",
  instructions: "You are a helpful assistant that remembers context.",
})

serve({ agents: [assistant] })

Thread agents return the complete message array, including the assistant’s response appended to the input messages.
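
That contract can be pictured as plain data: the response is the input history plus one appended assistant message. A minimal illustration in ordinary TypeScript, not Reminix API:

```typescript
// Simplified message shape for illustration.
type Message = { role: "user" | "assistant" | "system"; content: string }

// Mirrors the thread-agent contract described above: the result is the
// full input history with the assistant's reply appended at the end.
function appendReply(history: Message[], reply: string): Message[] {
  return [...history, { role: "assistant", content: reply }]
}
```

Because the full array comes back each time, callers can feed the returned messages straight into the next turn to preserve context.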

Options

The first argument to all Vercel AI agent constructors is a LanguageModel instance (e.g. from @ai-sdk/openai, @ai-sdk/anthropic) or a ToolLoopAgent.

name (string, required)
Agent name. Used as the endpoint identifier.

instructions (string)
System prompt for the model.

description (string)
Agent description for discovery and documentation.

outputSchema (JSONSchema)
Output schema for task agents. Defines the structured output format.

tags (string[])
Tags for filtering and organizing agents.

metadata (Record<string, unknown>)
Additional metadata attached to the agent.
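
Putting the options together, a task agent that sets every documented field might look like the following sketch; the description, tags, and metadata values are illustrative, not prescribed:

```typescript
import { openai } from "@ai-sdk/openai"
import { VercelAITaskAgent } from "@reminix/vercel-ai"
import { serve } from "@reminix/runtime"

// Uses each option from the list above; only name is required.
const analyzer = new VercelAITaskAgent(openai("gpt-4o"), {
  name: "analyzer",
  instructions: "Analyze the sentiment of the given text.",
  description: "Classifies text sentiment with a confidence score.",
  outputSchema: {
    type: "object",
    properties: {
      sentiment: { type: "string", enum: ["positive", "negative", "neutral"] },
      confidence: { type: "number" },
    },
    required: ["sentiment", "confidence"],
  },
  tags: ["nlp", "sentiment"],
  metadata: { team: "platform" },
})

serve({ agents: [analyzer] })
```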

Next steps

Deploying

Ship your Vercel AI agent to production.

Creating Agents

The plain agent() factory if you outgrow the wrapper.

OpenAI

Direct OpenAI SDK integration.

Anthropic

Direct Anthropic SDK integration.