Basic Usage

Start a conversation with a chat agent. The response includes a conversation_id you can use to continue the conversation.
from reminix import Reminix

client = Reminix(api_key="your-api-key")

response = client.agents.chat("support-bot", input={
    "messages": [
        {"role": "user", "content": "Hi, I need help with my account"}
    ],
})

print(response.output)
print(response.conversation_id)

Continuing a Conversation

Pass the conversation_id from a previous response to continue the same conversation. Reminix automatically manages the message history.
# First message
chat = client.agents.chat("support-bot", input={
    "messages": [
        {"role": "user", "content": "What's my account status?"}
    ],
})

# Follow-up in the same conversation
follow_up = client.agents.chat("support-bot", input={
    "messages": [
        {"role": "user", "content": "Can you update my email?"}
    ],
    "conversation_id": chat.conversation_id,
})

Parameters

agent_name
str
required
Name of the agent to chat with.
input
dict
required
Must contain a messages array. Optionally includes conversation_id to continue an existing conversation.
input.messages
list[dict]
required
OpenAI-compatible message format. Each message is an object with role and content fields.
input.conversation_id
str
Continue an existing conversation. Omit to start a new one.
context
dict
Execution context including identity and metadata.
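The exact shape of context isn't documented on this page; as a sketch, assuming identity and metadata are plain dicts (the inner field names here are illustrative, not confirmed), a call might look like:

```python
# Hypothetical context payload. "identity" and "metadata" come from the
# parameter description above; the nested field names are assumptions.
context = {
    "identity": {"user_id": "user_123"},  # who the agent is acting on behalf of (assumed field)
    "metadata": {"channel": "web"},       # free-form request metadata (assumed field)
}

# Then pass it alongside input, e.g.:
# response = client.agents.chat("support-bot", input={...}, context=context)
```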

Message Format

Messages follow the OpenAI-compatible format with role and content fields.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi! How can I help?"},
    {"role": "user", "content": "What's the weather?"},
]
The system role is optional and typically set by the agent configuration. You can include it to override or augment the agent’s system prompt.
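For example, to augment the configured prompt for a single call, prepend a system message to the messages you send (a sketch; the instruction text is illustrative):

```python
# Per-call system message layered on top of the agent's configured prompt.
system_override = {"role": "system", "content": "Answer in one sentence."}

messages = [
    system_override,
    {"role": "user", "content": "How do I reset my password?"},
]

# Pass it in as usual:
# response = client.agents.chat("support-bot", input={"messages": messages})
```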

Response

output
str
The agent’s response text.
conversation_id
str
The conversation ID for follow-up messages. Store this to continue the conversation.
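One way to "store this" in practice is to keep the returned ID per end user, so follow-ups from the same user reuse the same conversation. A minimal in-memory sketch (the store itself is up to you; a real app would persist it):

```python
# Maps your application's user IDs to Reminix conversation IDs.
conversations: dict[str, str] = {}

def remember(user_id: str, conversation_id: str) -> None:
    """Record the conversation_id returned by a chat() response."""
    conversations[user_id] = conversation_id

def build_input(user_id: str, text: str) -> dict:
    """Build the chat `input` dict, reusing a stored conversation_id if one exists."""
    payload = {"messages": [{"role": "user", "content": text}]}
    if user_id in conversations:
        payload["conversation_id"] = conversations[user_id]
    return payload
```

After each response, call remember(user_id, response.conversation_id); the next build_input for that user then continues the same conversation, while unseen users start fresh ones.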

With Streaming

Pass stream=True to receive the response incrementally as the agent generates it.
stream = client.agents.chat("support-bot", input={
    "messages": [{"role": "user", "content": "Tell me a story"}],
}, stream=True)

for event in stream:
    if event.type == "text_delta":
        print(event.delta, end="")
See Streaming for all event types.

Next steps

invoke()

Stateless one-shot calls when you don’t need history.

Conversations

How conversation IDs and identity scoping work.

Error Handling

Catch typed exceptions and add retries.

Streaming

Event types and SSE format reference.