
Overview

@reminix/langgraph lets you deploy LangGraph.js state machines and workflows as Reminix agents.

Installation

npm install @reminix/langgraph
This will also install @reminix/runtime as a dependency.

Quick Start

Wrap your LangGraph workflow and serve it:
import { StateGraph, END } from '@langchain/langgraph';
import { serveAgent } from '@reminix/langgraph';

// Define your state
interface AgentState {
  messages: string[];
  nextStep: string;
}

// Define nodes
async function processNode(state: AgentState) {
  return { nextStep: 'respond' };
}

async function respondNode(state: AgentState) {
  return { messages: [...state.messages, 'Response'] };
}

// Create graph
const graph = new StateGraph<AgentState>({
  channels: {
    messages: { value: (a, b) => [...a, ...b], default: () => [] },
    nextStep: { value: (_, b) => b, default: () => '' }
  }
});

graph.addNode('process', processNode);
graph.addNode('respond', respondNode);
graph.addEdge('process', 'respond');
graph.addEdge('respond', END);
graph.setEntryPoint('process');

const workflow = graph.compile();

// Wrap and serve
serveAgent(workflow, { name: 'workflow-agent', port: 8080 });
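
The channel reducers above decide how each node's partial return is merged into state: messages concatenates, while nextStep is simply overwritten. As a plain TypeScript sketch of that merge (illustrative only, no LangGraph required):

```typescript
// Illustrative reducers mirroring the channel definitions above.
const concat = (a: string[], b: string[]): string[] => [...a, ...b];
const overwrite = <T>(_prev: T, next: T): T => next;

// Starting state, then one node returns a partial update...
let state = { messages: ['hello'], nextStep: 'process' };
const update = { messages: ['Response'], nextStep: 'respond' };

// ...which is merged channel-by-channel using each channel's reducer:
state = {
  messages: concat(state.messages, update.messages),
  nextStep: overwrite(state.nextStep, update.nextStep),
};
// state is now { messages: ['hello', 'Response'], nextStep: 'respond' }
```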

Wrapping Workflows

Basic Graph

import { StateGraph } from '@langchain/langgraph';
import { wrapAgent } from '@reminix/langgraph';

const graph = new StateGraph<MyState>({ channels });
// ... define nodes and edges
const workflow = graph.compile();

const agent = wrapAgent(workflow, { name: 'my-workflow' });

With Checkpointing

import { MemorySaver } from '@langchain/langgraph';
import { wrapAgent } from '@reminix/langgraph';

const checkpointer = new MemorySaver();
const workflow = graph.compile({ checkpointer });

const agent = wrapAgent(workflow, {
  name: 'stateful-workflow',
  checkpointer
});
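
With a checkpointer attached, LangGraph.js keys saved state by a thread id passed in the per-invocation config, so repeated calls with the same id resume the same conversation. A minimal sketch (the `configurable.thread_id` key follows LangGraph.js conventions; `workflow` is the compiled graph from above):

```typescript
// Build the per-invocation config carrying the thread id.
const threadConfig = (threadId: string) => ({
  configurable: { thread_id: threadId },
});

// Two calls with the same thread id share checkpointed state:
// await workflow.invoke({ messages: ['hello'] }, threadConfig('user-42'));
// await workflow.invoke({ messages: ['and then?'] }, threadConfig('user-42'));
```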

Prebuilt Agents

import { createReactAgent } from '@langchain/langgraph/prebuilt';
import { wrapAgent } from '@reminix/langgraph';

const agent = createReactAgent({ llm, tools });
const reminixAgent = wrapAgent(agent, { name: 'react-agent' });

Configuration

const agent = wrapAgent(workflow, {
  name: 'my-workflow',
  description: 'Multi-step processing workflow',

  // State mapping
  inputKey: 'messages',
  outputKey: 'messages',

  // Streaming
  streaming: true,
  streamMode: 'updates'  // 'updates' (per-node deltas) or 'values' (full state after each step)
});
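
The difference between the two modes: 'values' emits the full state after every step, while 'updates' emits only what the node that just ran returned, keyed by node name. Roughly (an illustrative sketch of the shapes involved, not the library's implementation; reducers are simplified to overwrite here):

```typescript
type State = Record<string, unknown>;

// What one step would emit under each mode, given the state before the
// step, the node that ran, and the partial update it returned.
function emitted(
  mode: 'values' | 'updates',
  before: State,
  node: string,
  update: State
): State {
  return mode === 'values'
    ? { ...before, ...update }  // full state after the step
    : { [node]: update };       // just this node's delta
}
```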

Multiple Agents

For multi-agent projects, use wrapAgent + serve instead of serveAgent:
import { wrapAgent } from '@reminix/langgraph';
import { serve } from '@reminix/runtime';

const workflow1 = wrapAgent(graph1.compile(), { name: 'research-workflow' });
const workflow2 = wrapAgent(graph2.compile(), { name: 'writing-workflow' });

serve({ agents: [workflow1, workflow2], port: 8080 });
See Multiple Agents for detailed guidance on multi-agent projects.

State Management

Input Mapping

Map Reminix input to LangGraph state:
const agent = wrapAgent(workflow, {
  name: 'my-workflow',
  inputMapper: (reminixInput) => ({
    messages: [{ role: 'user', content: reminixInput.query }],
    context: reminixInput.context ?? {}
  })
});

Output Mapping

Map LangGraph output to Reminix response:
const agent = wrapAgent(workflow, {
  name: 'my-workflow',
  outputMapper: (state) => ({
    response: state.messages[state.messages.length - 1].content,
    steps: state.messages.length
  })
});
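
Because mappers are plain functions, they can be unit-tested without running the graph. A sketch using hypothetical mappers shaped like the ones above:

```typescript
// Hypothetical mappers matching the shapes used in the examples above.
const inputMapper = (input: { query: string; context?: object }) => ({
  messages: [{ role: 'user', content: input.query }],
  context: input.context ?? {},
});

const outputMapper = (state: { messages: { content: string }[] }) => ({
  response: state.messages[state.messages.length - 1].content,
  steps: state.messages.length,
});

const mapped = inputMapper({ query: 'hello' });
const out = outputMapper({ messages: [{ content: 'hi' }, { content: 'done' }] });
```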

Streaming

Stream graph updates in real time:
import Reminix from '@reminix/sdk';

const client = new Reminix();

const stream = await client.agents.invoke('workflow-agent', {
  messages: [{ role: 'user', content: 'Process this' }],
  stream: true
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}

Example: Multi-Step Research Agent

import { StateGraph, END } from '@langchain/langgraph';
import { wrapAgent } from '@reminix/langgraph';
import { serve } from '@reminix/runtime';

interface ResearchState {
  query: string;
  searchResults: any[];
  analysis: string;
  summary: string;
}

// webSearch, analyzeResults, and createSummary stand in for your own implementations
async function search(state: ResearchState) {
  const results = await webSearch(state.query);
  return { searchResults: results };
}

async function analyze(state: ResearchState) {
  const analysis = await analyzeResults(state.searchResults);
  return { analysis };
}

async function summarize(state: ResearchState) {
  const summary = await createSummary(state.analysis);
  return { summary };
}

const graph = new StateGraph<ResearchState>({
  channels: {
    query: { value: (_, b) => b, default: () => '' },
    searchResults: { value: (_, b) => b, default: () => [] },
    analysis: { value: (_, b) => b, default: () => '' },
    summary: { value: (_, b) => b, default: () => '' }
  }
});
graph.addNode('search', search);
graph.addNode('analyze', analyze);
graph.addNode('summarize', summarize);
graph.addEdge('search', 'analyze');
graph.addEdge('analyze', 'summarize');
graph.addEdge('summarize', END);
graph.setEntryPoint('search');

const workflow = graph.compile();
const agent = wrapAgent(workflow, { name: 'research-agent' });

serve({ agents: [agent], port: 8080 });
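
Linear edges suit this pipeline, but a step that can come back empty is often routed conditionally instead. LangGraph.js supports this via addConditionalEdges; the router itself is a plain, testable function. A hedged sketch (retrying search when no results were found is an illustrative policy, not part of the example above):

```typescript
// Route to 'analyze' when the search produced results, otherwise retry it.
function routeAfterSearch(state: { searchResults: unknown[] }): 'analyze' | 'search' {
  return state.searchResults.length > 0 ? 'analyze' : 'search';
}

// Wired into the graph in place of the fixed search -> analyze edge:
// graph.addConditionalEdges('search', routeAfterSearch);
```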

Usage

Once deployed, call your agent using invoke. See Agents for detailed guidance.

With Messages

For graph-based workflows with message history:
import Reminix from '@reminix/sdk';

const client = new Reminix();

const response = await client.agents.invoke('workflow-agent', {
  messages: [{ role: 'user', content: 'Research AI trends' }]
});

console.log(response.output);

With Custom Input

For workflows with custom input keys:
const response = await client.agents.invoke('research-agent', {
  query: 'Research AI trends in healthcare'
});

console.log(response.output);

Streaming

For real-time streaming responses:
const stream = await client.agents.invoke('workflow-agent', {
  messages: [{ role: 'user', content: 'Research AI trends' }],
  stream: true
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}

Deployment

See the Deployment guide for deploying to Reminix, or Self-Hosting for deploying on your own infrastructure.

Next Steps