Basic Invoke
Invoke an agent and get a result:
import Reminix from '@reminix/sdk';
const client = new Reminix();
const response = await client.agents.invoke('my-agent', {
input: {
prompt: 'Analyze this data',
data: { sales: [100, 200, 150] }
}
});
console.log(response.output);
The API uses input and output only. Request body: { input: { ... }, stream?: boolean }. Response: { output: ..., execution?: { id, url, type, status?, duration_ms? } }.
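For reference, these shapes can be written out as TypeScript types. The interfaces below are illustrative only (they are not exported by @reminix/sdk), and unknown is used wherever the actual type depends on your agent:

// Illustrative types for the invoke request/response shapes above.
// Not exported by @reminix/sdk; field types here are assumptions.
interface InvokeRequest {
  input: Record<string, unknown>; // arbitrary agent input
  stream?: boolean;               // set true for streaming responses
}

interface InvokeResponse {
  output: unknown; // shape depends on the agent
  execution?: {
    id: string;
    url: string;
    type: string;
    status?: string;
    duration_ms?: number;
  };
}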
// Task agent with custom input
const response = await client.agents.invoke('data-analyzer', {
input: {
task: 'analyze_trends',
data: {
values: [100, 200, 150, 300],
labels: ['Q1', 'Q2', 'Q3', 'Q4']
},
options: { includeForecast: true }
}
});
// Agent with prompt input
const response = await client.agents.invoke('code-generator', {
input: { prompt: 'A function to calculate fibonacci numbers' }
});
// For chat-style interactions with messages, use client.agents.chat() instead
// See the Chat guide: /typescript/chat
Response
The response is { output: ..., execution?: { id, url, type, status?, duration_ms? } }. Use response.output for the result.
const response = await client.agents.invoke('my-agent', { input: { prompt: 'Hello' } });
console.log(response.output);
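The optional execution object carries metadata about the run. Check that it is present before reading its fields; the fields below are the ones listed above:

if (response.execution) {
  // Execution metadata is only included when the platform returns it.
  console.log(`Execution ${response.execution.id} (${response.execution.type})`);
  console.log(`Status: ${response.execution.status ?? 'unknown'}`);
  console.log(`Duration: ${response.execution.duration_ms ?? 'n/a'} ms`);
  console.log(`Details: ${response.execution.url}`);
}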
For chat-style interactions, use client.agents.chat() instead, which returns a standardized messages array. See the Chat guide for details.
With Context
Pass additional context to your agent:
const response = await client.agents.invoke('my-agent', {
  input: { prompt: 'personalized analysis' },
  context: {
    identity: {
      user_id: 'user_456'
    },
    tenant_id: 'tenant_xyz',
    user_preferences: { language: 'en' }
  }
});
Streaming
Stream responses for real-time output:
const stream = await client.agents.invoke('my-agent', {
  input: { prompt: 'Write a story' },
  stream: true
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
Collecting Streamed Response
const chunks: string[] = [];

const stream = await client.agents.invoke('my-agent', {
  input: { prompt: 'Write a story' },
  stream: true
});

for await (const chunk of stream) {
  chunks.push(chunk);
  process.stdout.write(chunk);
}

const fullResponse = chunks.join('');
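If the stream fails partway through, the for await loop throws. Wrapping it in try/catch lets you keep whatever chunks already arrived; this is a plain TypeScript sketch and assumes no SDK-specific error types:

const chunks: string[] = [];

try {
  const stream = await client.agents.invoke('my-agent', {
    input: { prompt: 'Write a story' },
    stream: true
  });
  for await (const chunk of stream) {
    chunks.push(chunk);
  }
} catch (err) {
  // The partial output collected so far is still available in `chunks`.
  console.error('Stream interrupted:', err);
}

const partialOrFull = chunks.join('');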
Idempotency
Prevent duplicate processing with idempotency keys:
const response = await client.agents.invoke('payment-processor', {
  input: { action: 'charge', amount: 100 },
  idempotencyKey: 'charge_abc123'
});

// Same key returns the cached response (for 24 hours)
const response2 = await client.agents.invoke('payment-processor', {
  input: { action: 'charge', amount: 100 },
  idempotencyKey: 'charge_abc123'
});

// response2 is the same as response
Idempotency only works for non-streaming requests. Streaming responses are not cached.
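In practice, generate the key once per logical operation and reuse it on retries. The sketch below uses Node's crypto.randomUUID(); deriving the key from your own record ID works just as well, since the key format is up to you:

import { randomUUID } from 'node:crypto';

// Generate a key once per logical operation and reuse it on retries.
const idempotencyKey = `charge_${randomUUID()}`;

const response = await client.agents.invoke('payment-processor', {
  input: { action: 'charge', amount: 100 },
  idempotencyKey
});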
Handling the Response
The response structure depends on your agent’s configuration:
const response = await client.agents.invoke('my-agent', { input: { prompt: 'Hello' } });
// Access the output (structure depends on agent)
const output = response.output;
// Type narrowing for different output types
if (typeof output === 'object' && output !== null) {
console.log(`Result: ${(output as any).result}`);
}
if (typeof output === 'string') {
console.log(`Response: ${output}`);
}
if (Array.isArray(output)) {
output.forEach(item => console.log(item));
}
Typed Output
For better type safety, define your output type:
interface AnalysisResult {
trend: 'increasing' | 'decreasing' | 'stable';
average: number;
insights: string[];
}
const response = await client.agents.invoke('data-analyzer', {
  input: { data: [...] }
});

const result = response.output as AnalysisResult;
console.log(`Trend: ${result.trend}`);
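If you call the same agent from several places, a small wrapper can centralize the cast. invokeTyped below is a hypothetical helper, not part of the SDK; it simply wraps the documented invoke call:

// Hypothetical helper, not part of @reminix/sdk: wraps invoke and
// casts output to the caller-supplied type.
async function invokeTyped<T>(agent: string, input: Record<string, unknown>): Promise<T> {
  const response = await client.agents.invoke(agent, { input });
  return response.output as T;
}

const analysis = await invokeTyped<AnalysisResult>('data-analyzer', {
  data: { values: [100, 200, 150, 300] }
});
console.log(`Average: ${analysis.average}`);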
Timeout Considerations
Invoke requests have a 60-second timeout. For longer tasks:
// Option 1: Use streaming (no timeout)
const stream = await client.agents.invoke('my-agent', {
  input: { prompt: 'Long running analysis' },
  stream: true
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
// Option 2: Increase client timeout
const client = new Reminix({ timeout: 120000 }); // 2 minutes