# Anthropic SDK

Wrap Anthropic clients for use with the Reminix runtime

The Anthropic adapter wraps an `AsyncAnthropic` client and returns a Reminix agent that the runtime can serve.

## Installation

```bash
pip install reminix[runtime,anthropic]
```

## Usage

```python
from anthropic import AsyncAnthropic
from reminix.adapters.anthropic import from_anthropic
from reminix.runtime import serve

client = AsyncAnthropic()
agent = from_anthropic(
    client,
    name="claude-agent",
    model="claude-sonnet-4-20250514",
    system="You are a helpful assistant.",
)

serve(agent)
```
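
Once the runtime is running, the agent is exposed over HTTP. The sketch below calls the non-streaming `invoke` endpoint from Python using `httpx` (not a dependency of this package); it assumes the runtime was started with `port=8080` as in the streaming example further down, and reuses the `{"input": {"prompt": ...}}` request shape shown there. The response schema is defined by the runtime, so it is printed as raw JSON.

```python
import asyncio

import httpx


async def main() -> None:
    # Assumes serve(agent, port=8080); adjust the port to match your setup.
    async with httpx.AsyncClient() as http:
        resp = await http.post(
            "http://localhost:8080/agent/claude-agent/invoke",
            json={"input": {"prompt": "Say hello in one sentence."}},
            timeout=60,
        )
        resp.raise_for_status()
        print(resp.json())  # Response schema is runtime-defined; printed as-is.


if __name__ == "__main__":
    asyncio.run(main())
```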

## Options

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `client` | `AsyncAnthropic` | required | The Anthropic client instance |
| `name` | `str` | required | Name for the Reminix agent |
| `model` | `str` | `"claude-sonnet-4-20250514"` | The model to use |
| `system` | `str \| None` | `None` | Optional system prompt |
| `max_tokens` | `int` | `4096` | Maximum tokens in the response |
| `metadata` | `dict` | `None` | Optional metadata for the agent |
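
As a rough configuration sketch, the call below passes every optional parameter explicitly. The `metadata` keys are arbitrary illustrative values, not fields the adapter requires.

```python
from anthropic import AsyncAnthropic
from reminix.adapters.anthropic import from_anthropic

client = AsyncAnthropic()

# Every optional parameter spelled out; metadata values are illustrative only.
agent = from_anthropic(
    client,
    name="claude-support",
    model="claude-sonnet-4-20250514",
    system="You answer support questions concisely.",
    max_tokens=1024,
    metadata={"team": "support", "env": "staging"},
)
```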

## Handler Mapping

| Reminix Handler | Anthropic Method |
| --- | --- |
| `@agent.invoke` | `client.messages.create()` |
| `@agent.invoke_stream` | `client.messages.stream()` |
| `@agent.chat` | `client.messages.create()` |
| `@agent.chat_stream` | `client.messages.stream()` |
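
For reference, the underlying Anthropic SDK calls look roughly like the sketch below when made directly. This is plain `anthropic` SDK usage rather than the adapter's internal code, and the model, system prompt, and `max_tokens` values simply mirror the defaults above.

```python
from anthropic import AsyncAnthropic

client = AsyncAnthropic()


async def direct_invoke(prompt: str) -> str:
    # Non-streaming path, analogous to the invoke/chat handlers above.
    message = await client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=4096,
        system="You are a helpful assistant.",
        messages=[{"role": "user", "content": prompt}],
    )
    return message.content[0].text


async def direct_stream(prompt: str) -> None:
    # Streaming path, analogous to the invoke_stream/chat_stream handlers.
    async with client.messages.stream(
        model="claude-sonnet-4-20250514",
        max_tokens=4096,
        system="You are a helpful assistant.",
        messages=[{"role": "user", "content": prompt}],
    ) as stream:
        async for text in stream.text_stream:
            print(text, end="", flush=True)
```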

## Example: Streaming Response

```python
from anthropic import AsyncAnthropic
from reminix.adapters.anthropic import from_anthropic
from reminix.runtime import serve

client = AsyncAnthropic()
agent = from_anthropic(
    client,
    name="claude-stream",
    model="claude-sonnet-4-20250514",
    system="You are a creative writing assistant.",
    max_tokens=2048,
)

if __name__ == "__main__":
    serve(agent, port=8080)
```

Test streaming:

```bash
curl -X POST http://localhost:8080/agent/claude-stream/invoke \
  -H "Content-Type: application/json" \
  -d '{"input": {"prompt": "Write a haiku about coding."}, "stream": true}'
```
