MCP Server
The MCP Server is planned for a future release. This documentation describes the intended functionality to help you plan your integration ahead of time.
The Model Context Protocol (MCP) is an open standard for connecting AI tools to shared context. Cortex will provide an MCP server that lets tools like Cursor, Claude Desktop, and custom apps share a single memory.
Result: Tell one tool something, all tools remember it!
Compatible Tools
Cursor
Claude Desktop
Windsurf
Custom Apps
Planned Setup
Install Cortex CLI
$ brew install cortex-memory/tap/cli
$ npm install -g @cortexmemory/cli
Start MCP server
$ cortex mcp start
Configure Cursor
Add to ~/.cursor/mcp.json:
{
  "mcpServers": {
    "cortex": {
      "command": "cortex",
      "args": ["mcp", "serve"]
    }
  }
}
Restart Cursor
Restart Cursor to load the MCP server.
Install Cortex CLI
$ brew install cortex-memory/tap/cli
$ npm install -g @cortexmemory/cli
Configure Claude Desktop
Add to Claude Desktop config:
{
  "mcpServers": {
    "cortex": {
      "command": "cortex",
      "args": ["mcp", "serve"]
    }
  }
}
Config locations:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- Linux: ~/.config/Claude/claude_desktop_config.json
Restart Claude Desktop
Restart Claude to load the MCP server.
Custom Apps
For custom apps, call the planned HTTP endpoints directly:

import axios from 'axios';

const MCP_ENDPOINT = 'http://localhost:3000';
const USER_ID = 'user-123';

// Store memory
async function remember(content: string) {
  await axios.post(`${MCP_ENDPOINT}/add_memories`, {
    memory: content,
    user_id: USER_ID,
  });
}

// Retrieve context
async function search(query: string) {
  const response = await axios.post(`${MCP_ENDPOINT}/search_memory`, {
    query,
    user_id: USER_ID,
    limit: 5,
  });
  return response.data.results;
}
How MCP Memory Works
When using MCP, all your AI tools share ONE memory space (Hive Mode): if Cursor stores a preference, Claude can read it!
Example Flow
- In Cursor: You say "I prefer TypeScript for backend"
- Cursor → MCP: Stores preference via add_memories
- Later, in Claude: You ask about coding advice
- Claude → MCP: Searches for "coding preferences"
- Claude: Uses that fact to personalize response
Result: Claude knows your preferences without you repeating them!
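In code, that flow maps onto the planned endpoints like this. A minimal sketch reusing the remember and search helpers from the custom-app example above; the tool names are just labels, and any MCP client performs the same calls:

async function exampleFlow() {
  // Steps 1-2: one tool stores the preference via add_memories
  await remember('I prefer TypeScript for backend');

  // Steps 3-4: another tool later searches shared memory via search_memory
  const results = await search('coding preferences');

  // Step 5: the matching fact is used to personalize the response
  console.log(results); // includes "I prefer TypeScript for backend"
}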
Planned MCP Endpoints
| Endpoint | Method | Description |
|---|---|---|
| cortex_remember / /add_memories | POST | Store a memory |
| cortex_search / /search_memory | POST | Search memories |
| cortex_recall / /list_memories | GET | Get context for LLM |
| cortex_get_facts / /get_memory | GET | Retrieve extracted facts |
| /delete_all_memories | POST | Clear user memories |
Store Memory
POST /add_memories
{
  "memory": "User prefers dark mode",
  "user_id": "user-123",
  "metadata": { "source": "cursor", "importance": 70 }
}
Search Memory
POST /search_memory
{
  "query": "user preferences",
  "user_id": "user-123",
  "limit": 5
}
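There is no example yet for the remaining planned endpoints. As an assumption based on the pattern above, /delete_all_memories would likely accept the same user_id field; a hedged sketch using axios, MCP_ENDPOINT, and USER_ID from the custom-app example:

// Assumption: /delete_all_memories takes user_id like the other POST endpoints
async function forgetEverything() {
  await axios.post(`${MCP_ENDPOINT}/delete_all_memories`, {
    user_id: USER_ID,
  });
}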
Deployment Options
Run the MCP server locally on your own machine:
$ cortex mcp start
Free; data stays local; requires the server to be running.
Environment Variables:
CONVEX_URL=https://your-project.convex.cloud
MCP_PORT=3000
MCP_LOG_LEVEL=info
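For example, assuming the CLI reads these variables from the environment, a local start could look like:

$ CONVEX_URL=https://your-project.convex.cloud MCP_PORT=3000 cortex mcp start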
Cloud-hosted MCP endpoint—no local server needed:
{
  "mcpServers": {
    "cortex-cloud": {
      "url": "https://mcp.cortex.cloud/v1",
      "auth": {
        "type": "bearer",
        "token": "cortex_sk_your_api_key"
      }
    }
  }
}
- Always available (99.9% uptime)
- Auto-embeddings (no API key needed)
- Auto-fact extraction
- Team sharing
- Analytics dashboard
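A sketch of calling the cloud endpoint from a custom app, assuming it mirrors the local HTTP routes under /v1 and accepts the API key as a bearer token; the exact request shape is an assumption until the service ships:

import axios from 'axios';

const CLOUD_ENDPOINT = 'https://mcp.cortex.cloud/v1';
const API_KEY = process.env.CORTEX_API_KEY ?? ''; // e.g. cortex_sk_...

// Assumption: the cloud service exposes the same /add_memories route as the local server
async function rememberViaCloud(memory: string, userId: string) {
  await axios.post(
    `${CLOUD_ENDPOINT}/add_memories`,
    { memory, user_id: userId },
    { headers: { Authorization: `Bearer ${API_KEY}` } }
  );
}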
User Isolation
Each user_id has completely isolated memories. User A cannot see User B's data.
// User A stores preference
POST /add_memories
{ "memory": "I prefer light mode", "user_id": "alice@example.com" }
// User B stores preference
POST /add_memories
{ "memory": "I prefer dark mode", "user_id": "bob@example.com" }
// Searches are isolated
POST /search_memory
{ "query": "theme preference", "user_id": "alice@example.com" }
// Returns: "I prefer light mode" (Alice's only)
Use Cases
Cross-IDE Preferences
1. In Cursor: "I prefer 2-space indentation"
→ Stores in MCP
2. Later, in Windsurf: Ask AI to generate code
→ Queries MCP, gets "2-space indentation" preference
→ Generates code with correct indentation
3. In Claude: "Help me write a config file"
→ Knows your preferences from MCP
Personal Knowledge Base
Day 1 (Cursor): "Working on Project Apollo, a React app"
Day 3 (Claude): "Apollo uses PostgreSQL"
Day 7 (Any tool): "Tell me about Apollo"
→ MCP returns all accumulated facts
→ All tools have complete context