MCP Server

Planned Feature

The MCP Server is planned for a future release. This documentation describes the intended functionality to help you plan your integration ahead of time.

What is MCP?

The Model Context Protocol (MCP) is an open standard that lets AI tools share context and capabilities. Cortex will provide an MCP server so that tools like Cursor, Claude Desktop, and custom apps can share one memory.

Result: Tell one tool something, all tools remember it!


Compatible Tools

Cursor

Claude Desktop

Windsurf

Custom Apps


Planned Setup

1. Install Cortex CLI

Terminal
$ brew install cortex-memory/tap/cli

2. Start MCP server

Terminal
$ cortex mcp start

3. Configure Cursor

Add to ~/.cursor/mcp.json:

{
  "mcpServers": {
    "cortex": {
      "command": "cortex",
      "args": ["mcp", "serve"]
    }
  }
}

4. Restart Cursor

Restart Cursor to load the MCP server.


How MCP Memory Works

Hive Mode in Action

When using MCP, all your AI tools share ONE memory space (Hive Mode). If Cursor stores a preference, Claude can read it!

MCP Memory Architecture: AI Tools → Cortex MCP Server → Shared Memory Space

Example Flow

  1. In Cursor: You say "I prefer TypeScript for backend"
  2. Cursor → MCP: Stores preference via add_memories
  3. Later, in Claude: You ask about coding advice
  4. Claude → MCP: Searches for "coding preferences"
  5. Claude: Uses that fact to personalize response

Result: Claude knows your preferences without you repeating them!
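
For custom apps, the same flow can be driven programmatically. The sketch below is hypothetical: it assumes the official MCP TypeScript SDK (@modelcontextprotocol/sdk) and the planned tool names (add_memories, search_memory) listed under Planned MCP Endpoints below; names and argument shapes may change before release.

// Hypothetical sketch: connect a custom app to the planned Cortex MCP server
// using the MCP TypeScript SDK over stdio, the same way Cursor would via
// ~/.cursor/mcp.json.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "cortex",
  args: ["mcp", "serve"],
});

const client = new Client(
  { name: "my-custom-app", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Step 2 of the flow: store a preference (tool name assumed from the
// planned endpoints below).
await client.callTool({
  name: "add_memories",
  arguments: {
    memory: "I prefer TypeScript for backend",
    user_id: "user-123",
  },
});

// Step 4 of the flow: any other tool connected to the same server can
// search the shared memory space.
const hits = await client.callTool({
  name: "search_memory",
  arguments: { query: "coding preferences", user_id: "user-123", limit: 5 },
});
console.log(hits.content);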


Planned MCP Endpoints

Endpoint                           Method   Description
cortex_remember / /add_memories    POST     Store a memory
cortex_search / /search_memory     POST     Search memories
cortex_recall / /list_memories     GET      Get context for LLM
cortex_get_facts / /get_memory     GET      Retrieve extracted facts
/delete_all_memories               POST     Clear user memories

Store Memory

POST /add_memories
{
  "memory": "User prefers dark mode",
  "user_id": "user-123",
  "metadata": { "source": "cursor", "importance": 70 }
}

Search Memory

POST /search_memory
{
  "query": "user preferences",
  "user_id": "user-123",
  "limit": 5
}
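
As a sketch of how a custom app might call these planned endpoints over HTTP (hypothetical: it assumes the server listens locally on MCP_PORT=3000 as shown under Deployment Options below, and that paths and payloads stay as documented here):

// Hypothetical REST client for the planned HTTP endpoints.
const BASE_URL = "http://localhost:3000"; // assumes MCP_PORT=3000

async function addMemory(memory: string, userId: string): Promise<void> {
  const res = await fetch(`${BASE_URL}/add_memories`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ memory, user_id: userId }),
  });
  if (!res.ok) throw new Error(`add_memories failed: ${res.status}`);
}

async function searchMemory(query: string, userId: string, limit = 5) {
  const res = await fetch(`${BASE_URL}/search_memory`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, user_id: userId, limit }),
  });
  if (!res.ok) throw new Error(`search_memory failed: ${res.status}`);
  return res.json();
}

// Store a memory, then search the same user's memory space.
await addMemory("User prefers dark mode", "user-123");
console.log(await searchMemory("user preferences", "user-123"));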

Deployment Options

Run the MCP server locally on your machine:

Terminal
$ cortex mcp start
Tip

Local mode is free and your data stays on your machine, but the MCP server must be running for your tools to reach it.

Environment Variables:

CONVEX_URL=https://your-project.convex.cloud
MCP_PORT=3000
MCP_LOG_LEVEL=info
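
For example, assuming the server picks these up from the environment, a local start could look like:

Terminal
$ CONVEX_URL=https://your-project.convex.cloud MCP_PORT=3000 MCP_LOG_LEVEL=info cortex mcp start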

User Isolation

Automatic Isolation

Each user_id has completely isolated memories. User A cannot see User B's data.

// User A stores preference
POST /add_memories
{ "memory": "I prefer light mode", "user_id": "alice@example.com" }

// User B stores preference
POST /add_memories
{ "memory": "I prefer dark mode", "user_id": "bob@example.com" }

// Searches are isolated
POST /search_memory
{ "query": "theme preference", "user_id": "alice@example.com" }
// Returns: "I prefer light mode" (Alice's memories only)

Use Cases

Cross-IDE Preferences

1. In Cursor: "I prefer 2-space indentation"
→ Stores in MCP

2. Later, in Windsurf: Ask AI to generate code
→ Queries MCP, gets "2-space indentation" preference
→ Generates code with correct indentation

3. In Claude: "Help me write a config file"
→ Knows your preferences from MCP

Personal Knowledge Base

Day 1 (Cursor): "Working on Project Apollo, a React app"
Day 3 (Claude): "Apollo uses PostgreSQL"
Day 7 (Any tool): "Tell me about Apollo"
→ MCP returns all accumulated facts
→ All tools have complete context