Five-Minute Quickstart

The Cortex CLI guides you through setting up a complete AI agent with persistent memory. This guide walks you through every option in the cortex init wizard.

Before You Begin

Prerequisites: Node.js 20+, npm


Step 1: Install the CLI

Terminal
$ brew install cortex-memory/tap/cli

Verify installation:

Terminal
$ cortex --version
Expected

Version 0.27.4 or higher


Step 2: Run the Init Wizard

Terminal
$ cortex init

Or specify a directory:

Terminal
$ cortex init my-agent

The wizard walks you through six configuration steps. Let's explore each one.


Init Wizard Options

Option 1: Project Name

Terminal
$ ? Project name: my-cortex-agent
  • Requirements: Lowercase letters, numbers, hyphens, underscores only
  • Default: my-cortex-agent
  • Creates: Directory with this name in current folder
Existing Directory?

If the directory exists and has files, the wizard asks: "Add Cortex to existing project?" This lets you add Cortex to an existing codebase without overwriting your files.


Option 2: Convex Database Setup

Terminal
$ ? How would you like to set up Convex?
  ❯ Local development (fast, recommended)
    Create new cloud project
    Use existing Convex project

This is the most important choice. Cortex uses Convex as its backend database.

Local development

What it does:

  • Starts a local Convex server on http://127.0.0.1:3210
  • No account required
  • Data stored locally
  • Full feature support including vector search

Best for:

  • Learning Cortex
  • Local development
  • Offline work
  • Quick prototyping

Requires: Nothing extra—works immediately

Create new cloud project

What it does:

  • Creates a new project on Convex Cloud
  • Deploys your backend to https://[project].convex.cloud
  • Persistent data across machines
  • Automatic scaling

Best for:

  • Production deployments
  • Team collaboration
  • Sharing your agent

Requires: Free Convex account (wizard prompts login if needed)

Use existing Convex project

What it does:

  • Connects to a Convex project you already have
  • Deploys Cortex functions to your existing backend

Best for:

  • Adding Cortex to an existing Convex app
  • Migrating from another setup

Requires: Existing Convex project name

Can I Change This Later?

Yes! Use cortex config add-deployment to add more environments (local, staging, production) anytime.
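Whichever option you choose, the generated project reaches Convex through the deployment URL stored in .env.local. A minimal sketch, assuming the standard convex npm package (the Cortex SDK and templates normally wire this up for you, so you rarely write it by hand):

import { ConvexHttpClient } from "convex/browser";

// CONVEX_URL is http://127.0.0.1:3210 for local development,
// or https://<project>.convex.cloud for a cloud deployment.
const convex = new ConvexHttpClient(process.env.CONVEX_URL!);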


Option 3: Graph Database (Optional)

Terminal
$ ? Enable graph database integration? (Y/n)

Graph databases enable relationship queries and knowledge graphs—powerful for complex AI agents.

Neo4j

What it is: Enterprise-grade graph database with Cypher query language

Best for:

  • Production deployments
  • Complex relationship queries
  • Large-scale knowledge graphs

Local setup: Docker container on bolt://localhost:7687

Cloud option: Use Neo4j Aura or any Neo4j instance

In-memory alternative

What it is: In-memory graph database optimized for real-time analytics

Best for:

  • High-throughput applications
  • Real-time relationship queries
  • Analytics workloads

Local setup: Docker container on bolt://localhost:7687

You can always add graph integration later. Cortex works great without it—the graph layer adds advanced relationship queries on top of the core memory features.

Docker Required

Local graph databases require Docker Desktop. The wizard checks for Docker and shows installation instructions if missing.
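The wizard generates the Docker Compose file for you, but if you prefer to start Neo4j by hand, the standard image works. A rough sketch (the ports are Neo4j defaults; the container name and credentials here are illustrative, not Cortex-specific):

Terminal
$ docker run -d --name cortex-neo4j -p 7474:7474 -p 7687:7687 -e NEO4J_AUTH=neo4j/your-password neo4j:5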


Option 4: OpenAI API Key (Optional)

Terminal
$ ? Configure OpenAI API key now? (y/N)
  • Used for: AI-powered embeddings and fact extraction
  • Required?: No—you can add it to .env.local later
  • Get a key: platform.openai.com/api-keys
Bring Your Own Embeddings

Cortex is embedding-agnostic. You can use OpenAI, or provide your own embeddings from any provider (Cohere, Voyage, local models, etc.).
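For example, if you do use OpenAI, producing an embedding vector with the official openai Node package looks roughly like this (the model choice is yours; any provider that returns an array of floats can be swapped in):

import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

const res = await openai.embeddings.create({
  model: "text-embedding-3-small",
  input: "User works as a software engineer",
});
const embedding: number[] = res.data[0].embedding; // hand this vector to whatever store you use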


Option 5: CLI Scripts

Terminal
$ ? Add Cortex CLI scripts to package.json? (Y/n)

Adds convenient npm scripts:

{
  "scripts": {
    "cortex": "cortex",
    "cortex:setup": "cortex setup",
    "cortex:stats": "cortex db stats",
    "cortex:spaces": "cortex spaces list",
    "cortex:status": "cortex status"
  }
}
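Once added, you can invoke any of these through npm, for example:

Terminal
$ npm run cortex:stats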

Option 6: Vercel AI Quickstart (Optional)

Terminal
$ ? Install Vercel AI quickstart demo? (y/N)

The basic template is always installed—it includes a full CLI and HTTP API for testing Cortex. The Vercel AI Quickstart is an optional addition that gives you a web-based chat UI.

Template              | Installed | What You Get
Basic Template        | Always    | Interactive CLI + HTTP API server
Vercel AI Quickstart  | Optional  | Next.js ChatGPT-style web app
Both Templates Are Fully Functional

Either template lets you immediately start chatting and see Cortex memory in action. Choose based on whether you prefer terminal or browser.


Configuration Summary

Before proceeding, the wizard shows a summary:

Terminal
$ Configuration Summary
──────────────────────────────────────────────
Project:     my-cortex-agent
Location:    /Users/you/my-cortex-agent
Type:        New project
Convex:      Local development
Graph DB:    neo4j
OpenAI:      Configured
CLI Scripts: Yes
──────────────────────────────────────────────
? Proceed with setup? (Y/n)

What Gets Created

After confirmation, the wizard:

  1. Creates project files - package.json, TypeScript config, basic agent template
  2. Installs dependencies - @cortexmemory/sdk, convex, and related packages
  3. Sets up Cortex backend - Deploys Convex functions for memory operations
  4. Creates .env.local - Environment variables for your configuration
  5. Configures graph database - Docker compose file and env vars (if enabled)
  6. Saves to ~/.cortexrc - Persistent CLI configuration
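For reference, .env.local ends up holding your deployment URL and any keys you configured. A rough sketch (CONVEX_URL is the variable referenced under Troubleshooting below; the other name and values are illustrative and may differ in your generated file):

CONVEX_URL=http://127.0.0.1:3210
OPENAI_API_KEY=sk-...   # only if you configured OpenAI in the wizard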

Step 3: Start Development

After setup, start your services:

Terminal
$ cortex start

Or use interactive mode with live dashboard:

Terminal
$ cortex dev
Interactive Dev Mode

cortex dev provides a live dashboard with keyboard shortcuts:

  • s - Show status
  • r - Restart services
  • k - Kill stuck ports
  • q - Quit

Step 4: Start Using Cortex

Choose Your Interface

  • Basic Template — CLI + HTTP API (always installed)
  • Vercel AI Quickstart — Web chat UI (optional)

The basic template gives you an interactive CLI and HTTP API to explore Cortex immediately.

Chat via CLI

Terminal
$ npm start

You'll see an interactive prompt where you can chat and watch memory orchestration in real-time:

You: Hi, I'm Alex. I work as a software engineer.

┌────────────────────────────────────────────────────────────────────┐
│ MEMORY ORCHESTRATION │
├────────────────────────────────────────────────────────────────────┤
│ 📦 Memory Space ✓ complete (2ms) │
│ 👤 User ✓ complete (5ms) │
│ 🤖 Agent ✓ complete (3ms) │
│ 💬 Conversation ✓ complete (8ms) │
│ 🎯 Vector Store ✓ complete (45ms) │
│ 💡 Facts ✓ complete [NEW] (120ms) │
│ → "User's name is Alex" (identity, 95%) │
│ → "User works as software engineer" (employment, 90%) │
└────────────────────────────────────────────────────────────────────┘

CLI Commands:

Command          | Description
/recall <query>  | Search memories without storing
/facts           | List all extracted facts
/history         | Show conversation history
/new             | Start a new conversation
/config          | Show current configuration
/exit            | Exit the demo

Or Use the HTTP API

Terminal
$ npm run server

Then send requests from another terminal:

curl -X POST http://localhost:3001/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hi, I am Alex. I work as a software engineer."}'

API Endpoints:

Endpoint           | Method | Description
/chat              | POST   | Chat and store memory
/recall?query=...  | GET    | Search memories
/facts             | GET    | List extracted facts
/history/:id       | GET    | Get conversation history
/health            | GET    | Health check
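The same endpoints can be called from any HTTP client. A small TypeScript sketch using the paths above (the response shapes are not documented here, so treat them as opaque JSON):

const BASE = "http://localhost:3001";

// Store a message and let Cortex extract facts from it
await fetch(`${BASE}/chat`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ message: "Hi, I am Alex. I work as a software engineer." }),
});

// Search memories without storing anything new
const recall = await fetch(`${BASE}/recall?query=` + encodeURIComponent("where do I work?"));
console.log(await recall.json());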

CLI Flags for Non-Interactive Setup

Skip the wizard with flags:

Terminal
$ cortex init my-agent --local --skip-graph -y
Flag                   | Description
--local                | Use local Convex (no prompts)
--cloud                | Use cloud Convex (no prompts)
--skip-graph           | Skip graph database setup
-t, --template <name>  | Template to use (default: basic)
-y, --yes              | Skip all confirmation prompts
--start                | Start services after setup

Explore Your Data

Convex Dashboard

Terminal
$ cortex convex dashboard

Opens the correct dashboard for your deployment (local or cloud).

Table         | Description
conversations | All conversation threads
memories      | Searchable memory index
immutable     | Versioned message history
users         | User profiles
memorySpaces  | Memory space registry

CLI Commands

Terminal
$ cortex db stats
Terminal
$ cortex spaces list
Terminal
$ cortex memory list --space my-agent

Troubleshooting

Can't connect to Convex?

Terminal
$ cortex config test

Ensure Convex is running and CONVEX_URL is set in .env.local.

Ports stuck or services won't stop?

Use cortex dev then press k to kill stuck processes, or:

Terminal
$ cortex stop

Docker not installed?

Install Docker Desktop from docker.com, or choose "Cloud/Existing instance" for the graph database.

Convex authentication errors?

The wizard handles authentication automatically. If you see auth errors:

Terminal
$ npx convex logout
Terminal
$ npx convex login

Then run cortex init again.


What's Next?