Formulate
VS Code Extension · Tooling layer

Your AI context, everywhere you code.

A shared memory layer across your AI coding tools. Store context once in VS Code — access it in Cursor, Windsurf, and every AI assistant you use.

terminal
# 1. install server
$ pip install continuum-context-hub
# 2. start server
$ continuum serve
# 3. connect VS Code
Cmd+Shift+P → "Continuum: Connect"
✓ Connected to Continuum v0.2.4

One memory layer. Every tool.

Continuum runs a local server that acts as shared context storage. Any connected IDE reads and writes to the same memory.

VS Code
Cursor
Windsurf
───▶
LOCAL SERVER
continuum serve
localhost:8000
───▶
SHARED MEMORY
Project context
per-project · local
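The shared-memory idea above can be sketched in a few lines of Python. This is a hypothetical illustration, not the server's actual storage or API: the `MemoryStore` class, its method names, and the dict fields are all invented for the example; only the concept (one per-project store that every connected IDE reads and writes) comes from the page.

```python
# Hypothetical sketch of a Continuum-style per-project memory store.
# Class, method, and field names are illustrative, not the real API.

class MemoryStore:
    """One store per project; every connected IDE shares it."""

    def __init__(self):
        self._projects = {}  # project name -> list of memory dicts

    def remember(self, project, text, category, importance):
        """Store a memory under a project, from any connected tool."""
        self._projects.setdefault(project, []).append(
            {"text": text, "category": category, "importance": importance}
        )

    def recall(self, project, query):
        """The real server does semantic search; this is substring match."""
        return [
            m for m in self._projects.get(project, [])
            if query.lower() in m["text"].lower()
        ]

# What one tool writes, another can read: store from "VS Code", recall
# from "Cursor" -- both are just clients of the same local store.
store = MemoryStore()
store.remember("my-app", "API uses cursor-based pagination", "conventions", "high")
print(store.recall("my-app", "pagination")[0]["text"])
```

The point of the sketch is the single source of truth: there is no per-editor state, so switching tools costs nothing.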

Built for developers who think in systems.

Shared memory across tools

Memories stored from VS Code are instantly available in Cursor, Windsurf, or any Continuum-connected IDE. Switch tools without losing your project context.

Semantic categories + importance

Every memory gets a category (architecture, conventions, patterns, decisions) and importance level — so AI agents always prioritize the most critical context.
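Prioritization by importance can be sketched as a sort over tagged memories. The category names (architecture, conventions, patterns, decisions) come from the page; the importance levels and the ranking function are assumptions made for this example, not Continuum's documented behavior.

```python
# Hypothetical sketch of category + importance prioritization.
# The importance levels and ranking are illustrative assumptions.

IMPORTANCE_RANK = {"critical": 0, "high": 1, "medium": 2, "low": 3}

memories = [
    {"text": "Prefer composition over inheritance", "category": "patterns", "importance": "medium"},
    {"text": "All services talk via the event bus", "category": "architecture", "importance": "critical"},
    {"text": "We chose Postgres over Mongo", "category": "decisions", "importance": "high"},
]

def prioritized(mems, category=None):
    """Most important context first; optionally filter by category."""
    picked = [m for m in mems if category is None or m["category"] == category]
    return sorted(picked, key=lambda m: IMPORTANCE_RANK[m["importance"]])

for m in prioritized(memories):
    print(m["importance"], "·", m["category"], "·", m["text"])
```

Ordering context this way means an AI agent with a limited context window sees the critical architectural facts before the nice-to-have patterns.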

Six commands, zero friction

Remember, Recall, Sync, Push Selection, Project Context — all from the Command Palette. Continuum fits inside the way you already work.

COMMANDS
Continuum: Connect · Check connection to the server
Continuum: Remember · Store a new memory with category + importance
Continuum: Recall · Search memories by semantic query
Continuum: Push Selection · Push selected text as a memory
Continuum: Project Context · View all project memories as a document
Continuum: Sync · Refresh and display full project context

Three steps to shared context.

Install the server, start it, connect VS Code. You're done.

01 · Install the server · pip install continuum-context-hub
02 · Start the server · continuum serve
03 · Connect from VS Code · Cmd+Shift+P → "Continuum: Connect"