Conversation History

Layer 1a: Conversations

Conversation History is the ACID source of truth for all message data. Every message is immutably stored—append-only, never modified or deleted—providing complete audit trails.


Quick Start

```typescript
// Create a conversation
const conv = await cortex.conversations.create({
  type: 'user-agent',
  memorySpaceId: 'user-123-space',
  participants: {
    user: { id: 'user-123', name: 'Alice' },
    agent: { id: 'assistant' },
  },
});

// Add messages (immutable, append-only)
await cortex.conversations.addMessage({
  conversationId: conv.id,
  message: {
    role: 'user',
    content: 'Hello!',
    userId: 'user-123',
  },
});

await cortex.conversations.addMessage({
  conversationId: conv.id,
  message: {
    role: 'agent',
    content: 'Hi there! How can I help?',
  },
});

// Retrieve history
const history = await cortex.conversations.getHistory(conv.id, {
  limit: 50,
  order: 'desc',
});
// history.messages contains the array of messages
```

Where Conversations Fit

Conversations live in Layer 1a of Cortex's 4-layer architecture. They work together with Layer 2 (Vector) for efficient retrieval:

| Feature | Layer 1a: ACID Conversations | Layer 2: Vector Index |
|---------|------------------------------|------------------------|
| Purpose | Source of truth | Searchable index |
| Mutability | Immutable (append-only) | Versioned with updates |
| Retention | Forever (compliance) | Configurable limits |
| Search | Chronological retrieval | Semantic search |
| Use Case | Audit trails, replay | Fast context building |

The Link Between Layers

Vector memories reference ACID conversations via conversationRef. You can always trace back from any memory to its original conversation.
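
For example, here is a quick sketch of tracing a vector search hit back to its source conversation, using the same search and get calls shown later on this page (embed() stands in for your application's embedding function):

```typescript
// Find a relevant memory in the vector index (Layer 2)...
const memories = await cortex.memory.search('user-123-space', 'billing question', {
  embedding: await embed('billing question'), // embed() is your embedding helper
  limit: 1,
});

// ...then follow conversationRef back to the ACID record (Layer 1a)
const memory = memories[0];
if (memory?.conversationRef) {
  const sourceConversation = await cortex.conversations.get(
    memory.conversationRef.conversationId
  );
  // sourceConversation holds the full, immutable message history
}
```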


Conversation Operations

```typescript
// User-Agent conversation
const conv = await cortex.conversations.create({
  type: 'user-agent',
  memorySpaceId: 'user-123-space',
  participants: {
    user: { id: 'user-123', name: 'Alice' },
    agent: { id: 'assistant' },
  },
  metadata: { channel: 'web' },
});

// Agent-Agent conversation (A2A) - manual approach
// Note: cortex.a2a.send() handles this automatically (recommended)
const a2aConv = await cortex.conversations.create({
  type: 'agent-agent',
  participants: {
    agent1: 'finance-agent',
    agent2: 'hr-agent',
  },
});
```
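
For comparison, the automatic path looks roughly like the sketch below. The exact parameters of cortex.a2a.send() are not documented on this page, so the field names here are illustrative assumptions only:

```typescript
// Hypothetical sketch: cortex.a2a.send() creates or reuses the
// agent-agent conversation for you. The parameter names below are
// assumptions, not the documented signature.
await cortex.a2a.send({
  from: 'finance-agent',
  to: 'hr-agent',
  message: 'Q3 payroll forecast is ready for review.',
});
```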

Message Schema

| Parameter | Type | Required | Default | Description |
|-----------|------|----------|---------|-------------|
| role | string | Yes | — | 'user' \| 'agent' \| 'system' |
| content | string | Yes | — | Message content |
| userId | string | No | — | User who sent the message (for user messages) |
| timestamp | Date | No | — | Auto-set if not provided |
| metadata | object | No | — | Custom message data (channel, device, etc.) |
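
Putting the schema together, a message with every field populated looks like this (values are illustrative):

```typescript
await cortex.conversations.addMessage({
  conversationId: conv.id,
  message: {
    role: 'user',                       // 'user' | 'agent' | 'system'
    content: 'Can you check my order status?',
    userId: 'user-123',                 // who sent it (user messages)
    timestamp: new Date(),              // optional; auto-set if omitted
    metadata: { channel: 'web', device: 'desktop' }, // custom message data
  },
});
```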

Conversation Types

A standard chat between a user and an AI agent:

```typescript
{
  conversationId: 'conv-456',
  type: 'user-agent',
  participants: {
    userId: 'user-123',
    memorySpaceId: 'agent-1'
  },
  messages: [
    { id: 'msg-001', role: 'user', content: '...', timestamp: T1 },
    { id: 'msg-002', role: 'agent', content: '...', timestamp: T2 }
  ]
}
```

Immutable Audit Trail

ACID Properties
  • Atomicity: Message fully stored or not at all
  • Consistency: Conversation always in valid state
  • Isolation: Concurrent messages don't interfere
  • Durability: Once stored, never lost
Compliance Benefits

Messages are never modified or deleted from Layer 1a. Vector memories (Layer 2) can be deleted while preserving the ACID conversation history for compliance.
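
A sketch of that workflow is shown below; cortex.memory.forget() and the memory.id field are assumptions used only for illustration, so check the Memory API reference for the actual deletion call:

```typescript
// Hypothetical sketch: given a memory returned by cortex.memory.search(),
// delete it from the vector index (Layer 2).
// cortex.memory.forget() is an assumed name, not a documented API.
await cortex.memory.forget('user-123-space', memory.id);

// The ACID record in Layer 1a is untouched and still retrievable for audits.
const history = await cortex.conversations.getHistory(conv.id);
console.log(history.messages.length); // unchanged
```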


Streaming Support

For Streaming LLM Responses

Use rememberStream(), which handles progressive storage automatically:

```typescript
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

const response = await streamText({
  model: openai('gpt-4'),
  messages: [{ role: 'user', content: 'What is AI?' }],
});

// Automatically buffers and stores
const result = await cortex.memory.rememberStream({
  memorySpaceId: 'my-agent',
  conversationId: conv.id,
  userMessage: 'What is AI?',
  responseStream: response.textStream,
  userId: 'user-123',
  userName: 'Alice',
});

console.log(result.fullResponse);
```

See Streaming Support for full documentation.


Building Context

Recent Context (Chronological)

```typescript
async function buildRecentContext(conversationId: string) {
  const history = await cortex.conversations.getHistory(conversationId, {
    limit: 10,
  });

  return history.messages
    .map(msg => `${msg.role === 'user' ? 'User' : 'Assistant'}: ${msg.content}`)
    .join('\n');
}
```

Relevant Context (Semantic)

```typescript
async function buildRelevantContext(memorySpaceId: string, query: string) {
  // Search vector memories (embed() is your application's embedding function)
  const memories = await cortex.memory.search(memorySpaceId, query, {
    embedding: await embed(query),
    limit: 5,
  });

  // Retrieve full conversations via conversationRef
  for (const memory of memories) {
    if (memory.conversationRef) {
      const conv = await cortex.conversations.get(
        memory.conversationRef.conversationId
      );
      // Access full context...
    }
  }
}
```

Common Patterns

Multi-Session Continuity

```typescript
async function handleReturningUser(userId: string, message: string) {
  // Find recent conversations
  const convs = await cortex.conversations.list({
    'participants.userId': userId,
    sortBy: 'lastMessageAt',
    sortOrder: 'desc',
    limit: 1,
  });

  const lastConv = convs[0];
  const timeSinceLast = lastConv
    ? Date.now() - lastConv.lastMessageAt.getTime()
    : Infinity;

  // Continue or start new
  const conversationId = timeSinceLast < 30 * 60 * 1000
    ? lastConv.id
    : (await cortex.conversations.create({
        type: 'user-agent',
        participants: { userId, agentId: 'assistant' },
      })).id;

  await cortex.conversations.addMessage({
    conversationId,
    message: {
      role: 'user',
      content: message,
      userId,
    },
  });

  return conversationId;
}
```

Context Window Management

```typescript
async function buildContextWindow(conversationId: string, maxTokens: number) {
  const history = await cortex.conversations.getHistory(conversationId);
  const messages = history.messages;

  const contextMessages = [];
  let tokenCount = 0;

  // Build from newest to oldest
  for (let i = messages.length - 1; i >= 0; i--) {
    // estimateTokens() is your own heuristic (e.g., ~4 characters per token)
    const tokens = estimateTokens(messages[i].content);
    if (tokenCount + tokens > maxTokens) break;
    contextMessages.unshift(messages[i]);
    tokenCount += tokens;
  }

  return contextMessages;
}
```
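
The window can then be fed straight into an LLM call. The sketch below assumes the ai SDK used earlier on this page and maps the stored 'agent' role onto the SDK's 'assistant' role:

```typescript
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Fit the most recent messages into a ~3,000-token budget
const window = await buildContextWindow(conv.id, 3000);

// Map stored roles onto the ai SDK's roles ('agent' -> 'assistant')
const llmMessages = window.map((msg) => ({
  role: msg.role === 'agent' ? 'assistant' : msg.role,
  content: msg.content,
}));

const response = await streamText({
  model: openai('gpt-4'),
  messages: llmMessages,
});
```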

Best Practices

Store Complete Metadata
```typescript
await cortex.conversations.addMessage({
  conversationId: conv.id,
  message: {
    role: 'user',
    content: message,
    userId,
    metadata: {
      channel: 'web',
      device: 'desktop',
      sessionId: 'session-123',
    },
  },
});
```

Redact Before Storing
```typescript
// ACID is immutable - redact BEFORE storing
function redact(text: string) {
  return text
    .replace(/\b\d{3}-\d{2}-\d{4}\b/g, '[SSN REDACTED]')
    .replace(/\b\d{16}\b/g, '[CARD REDACTED]');
}

await cortex.conversations.addMessage({
  conversationId: conv.id,
  message: {
    role: 'user',
    content: redact(userMessage),
    userId,
  },
});
```

Summarize Long Conversations

```typescript
// For long conversations, store summaries as vector memories
if (conversation.messageCount > 100) {
  // summarize() and olderMessages are provided by your application code
  const summary = await summarize(olderMessages);

  await cortex.memory.remember({
    memorySpaceId,
    content: `Summary: ${summary}`,
    conversationRef: { conversationId, messageIds: [] },
  });
}
```

Next Steps