
Installation

npm install @reminix/langchain @langchain/core
Requires @langchain/core as a peer dependency. You’ll also need a model provider package like @langchain/openai or @langchain/anthropic.

Chat Agent

Use LangChainChatAgent for conversational agents with streaming support.
import { ChatOpenAI } from "@langchain/openai"
import { LangChainChatAgent } from "@reminix/langchain"
import { serve } from "@reminix/runtime"

const model = new ChatOpenAI({ model: "gpt-4o" })

const chatbot = new LangChainChatAgent(model, {
  name: "chatbot",
  instructions: "You are a helpful assistant.",
})

serve({ agents: [chatbot] })

Using Tools

Use createReactAgent from @langchain/langgraph/prebuilt to create an agent with tools, then pass the resulting graph as the first argument.
import { ChatOpenAI } from "@langchain/openai"
import { tool } from "@langchain/core/tools"
import { z } from "zod"
import { createReactAgent } from "@langchain/langgraph/prebuilt"
import { LangChainChatAgent } from "@reminix/langchain"
import { serve } from "@reminix/runtime"

const getWeather = tool(
  async ({ city }) => {
    const data: Record<string, string> = {
      paris: "Sunny, 22°C",
      london: "Cloudy, 15°C",
      tokyo: "Rainy, 18°C",
    }
    return data[city.toLowerCase()] ?? `Weather data not available for ${city}`
  },
  {
    name: "get_weather",
    description: "Get the current weather for a city",
    schema: z.object({ city: z.string() }),
  }
)

const llm = new ChatOpenAI({ model: "gpt-4o" })
const graph = createReactAgent({ llm, tools: [getWeather] })

const chatbot = new LangChainChatAgent(graph, {
  name: "weather-assistant",
})

serve({ agents: [chatbot] })
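If you want to sanity-check the tool's behavior outside the agent, the handler's core logic can be exercised directly. This is a plain-function sketch of the handler above, without the @langchain/core tool wrapper:

```typescript
// Core lookup logic from the get_weather handler above, extracted
// as a plain async function so it can be tested in isolation.
async function lookupWeather(city: string): Promise<string> {
  const data: Record<string, string> = {
    paris: "Sunny, 22°C",
    london: "Cloudy, 15°C",
    tokyo: "Rainy, 18°C",
  }
  // Fall back to a descriptive message for unknown cities.
  return data[city.toLowerCase()] ?? `Weather data not available for ${city}`
}
```

Keeping the lookup separate from the tool wrapper makes it easy to unit-test before wiring it into the agent.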
You can also pass a compiled LangGraph state graph directly:
import { StateGraph } from "@langchain/langgraph"
import { LangChainChatAgent } from "@reminix/langchain"
import { serve } from "@reminix/runtime"

const graph = new StateGraph({ /* ... */ })
  .addNode("agent", agentNode) // agentNode defined elsewhere
  .addEdge("__start__", "agent")
  .addEdge("agent", "__end__")
  .compile()

const chatbot = new LangChainChatAgent(graph, {
  name: "chatbot",
  instructions: "You are a helpful assistant.",
})

serve({ agents: [chatbot] })

Streaming

Chat agents support streaming out of the box:
import Reminix from "@reminix/sdk"

const client = new Reminix()

const stream = await client.agents.chat("chatbot", {
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
})

for await (const event of stream) {
  if (event.type === "text_delta") {
    process.stdout.write(event.delta)
  }
}
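If you need the full response text rather than incremental output, the deltas can be concatenated. The event shape here follows the streaming loop above; other event types are simply skipped:

```typescript
// Minimal event shape matching the streaming loop above.
type StreamEvent = { type: string; delta?: string }

// Collect text_delta events from a stream into a single string.
async function collectText(stream: AsyncIterable<StreamEvent>): Promise<string> {
  let text = ""
  for await (const event of stream) {
    if (event.type === "text_delta" && event.delta !== undefined) {
      text += event.delta
    }
  }
  return text
}
```

Pass the stream returned by client.agents.chat to collectText to get the assembled response once the stream ends.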

Task Agent

Use LangChainTaskAgent for single-turn tasks that return structured output.
import { ChatOpenAI } from "@langchain/openai"
import { LangChainTaskAgent } from "@reminix/langchain"
import { serve } from "@reminix/runtime"

const model = new ChatOpenAI({ model: "gpt-4o" })

const analyzer = new LangChainTaskAgent(model, {
  name: "analyzer",
  instructions: "Analyze the sentiment of the given text.",
})

serve({ agents: [analyzer] })

Thread Agent

Use LangChainThreadAgent for agents that manage full message history.
import { ChatOpenAI } from "@langchain/openai"
import { LangChainThreadAgent } from "@reminix/langchain"
import { serve } from "@reminix/runtime"

const model = new ChatOpenAI({ model: "gpt-4o" })

const assistant = new LangChainThreadAgent(model, {
  name: "assistant",
  instructions: "You are a helpful assistant that remembers context.",
})

serve({ agents: [assistant] })
Thread agents return the complete message array, including the assistant’s response appended to the input messages.
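That append behavior can be illustrated with plain data. The message shape here is taken from the streaming example above; the actual response content is, of course, model-dependent:

```typescript
type Message = { role: "user" | "assistant" | "system"; content: string }

// Simulate a thread-agent round trip: the returned array is the
// input history plus the assistant's reply appended at the end.
// The input array itself is not mutated.
function appendResponse(history: Message[], reply: string): Message[] {
  return [...history, { role: "assistant", content: reply }]
}

const input: Message[] = [{ role: "user", content: "Hello!" }]
const output = appendResponse(input, "Hi! How can I help?")
```

The caller can persist the returned array as the thread state and pass it back in on the next turn.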

Options

The first argument to all LangChain agent constructors is a Runnable — either a plain model or a compiled LangGraph CompiledStateGraph.
name (string, required)
Agent name. Used as the endpoint identifier.

instructions (string)
System prompt for the model.

description (string)
Agent description for discovery and documentation.

tags (string[])
Tags for filtering and organizing agents.

metadata (Record<string, unknown>)
Additional metadata attached to the agent.

Next steps

Deploying

Ship your LangChain agent to production.

Configuration & Secrets

Where to put model provider API keys.

Vercel AI

A lighter-weight alternative to LangChain.

Python: LangChain

The same integration in Python.