🚀 Founding Member: First 1,000 users lock in 50% off for life · Get Started
Persistent memory for AI tools — works with Claude, Cursor, Windsurf, VS Code & beyond

Stop re-explaining your project to every AI tool.

ContextStream remembers what your AI tools forget: decisions, context, and the reasoning behind them. That memory is shared across tools and persists across sessions—no more re-explaining yourself. It retrieves only relevant context for each conversation, keeping prompts lean. For code, it goes deeper with semantic search and dependency analysis.

  • Shared memory across tools that doesn't reset between chats
  • Context pack: relevant code/docs/decisions only (no copy-pasting, no bloated chat history)
  • Dependency + impact analysis on demand ("what breaks if I change this?")

3,000 free operations • No credit card required

Need recurring usage + no-expiry memory? Explore Pro/Elite

Setup in <2 minutes • Founding members get 50% off for life (first 1,000)

$ npx -y @contextstream/mcp-server setup

One command authenticates, creates an API key, and writes the correct MCP config for your tool.

Local MCP server + transparent data flow. Common dependency/build directories and large files are skipped by default; data is encrypted.

See the difference

Watch how ContextStream transforms your AI workflow

Left: Manual workarounds, scattered notes. Right: Automatic memory that just works.

Code Search

Search That Actually Works

Six intelligent search modes, each optimized for different tasks. Find code by meaning, by symbol, by pattern — and save tokens doing it.

Semantic
"how is authentication handled"

Finds actual auth logic, middleware, and session handling

Understands meaning, not just keywords

Hybrid
"rate limiting implementation"

Combines meaning + keywords for best of both

Most accurate for general questions

Symbol
"UserAuthService"

Exact class/function matches with context

Perfect for jumping to definitions

Refactor
"oldFunctionName"

All references with word-boundary precision

Safe renames, no false positives

Pattern
"*.test.ts"

All matching files by glob pattern

Find files by naming convention

Exhaustive
"TODO"

Every single match, like grep

When you need ALL occurrences
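Under the hood, each mode is just a parameter on a single consolidated search tool exposed over MCP. A hypothetical `tools/call` request is sketched below — the `mode`, `query`, and `project` argument names are illustrative, not the exact schema:

```json
{
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": {
      "query": "how is authentication handled",
      "mode": "semantic",
      "project": "my-app"
    }
  }
}
```

Swapping `"mode": "semantic"` for `"symbol"`, `"refactor"`, `"pattern"`, or `"exhaustive"` selects the corresponding behavior above.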

Smart Token Savings

Search automatically suggests the most token-efficient response format based on your query. Get exactly what you need without context bloat.

Full

Complete code context

Best for: Understanding code

Minimal
up to 60% savings

Path + line + snippet

Best for: Symbol lookups

Count
up to 90% savings

Just the number

Best for: 'How many TODOs?'
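As a sketch of how the formats above might be requested explicitly — assuming a `response_format` argument, which is illustrative rather than the exact schema — a count-only query could look like:

```json
{
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": {
      "query": "TODO",
      "mode": "exhaustive",
      "response_format": "count"
    }
  }
}
```

Returning just the match count instead of full snippets is where the up-to-90% token savings comes from.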

More Accurate

Finds the right code on the first try

Faster Answers

One search instead of file-by-file scanning

Lower Cost

Smaller context means fewer tokens spent

How It Works

Local MCP server - encrypted index storage

Built with security and privacy in mind from the beginning. During indexing, the local MCP server sends file content to build embeddings and analyze dependencies. We store an encrypted index (embeddings + metadata + limited file content for search/browsing). Full graph tier plans (Elite/Team/Enterprise) can store full file content in encrypted object storage when enabled.

Your IDE/Terminal

Cursor, Claude Code, Codex, Windsurf, VS Code

MCP Server

Runs locally on your machine

ContextStream API

Secure and Encrypted

Vector Storage

Embeddings + metadata stored

What we securely store: Encrypted index data (embeddings + metadata + limited file content for search/browsing), dependency graphs, code quality metrics, decision tracking, lessons learned, and memory events. Full file content is stored only on Full graph tier plans (Elite/Team/Enterprise), when enabled, in encrypted object storage.

What we automatically exclude: Common dependency/build directories, lockfiles, and large files by default. Only common code/config/doc file types are indexed.
What gets sent over the wire:

• Indexing: File content is sent for embedding generation, code analysis, and search indexing.

• Queries: Your search query is sent; matching code and context are returned from the stored index.

• Excluded automatically: common dependency/build directories, lockfiles, and large files; only common code/config/doc file types are indexed.

But that's just memory...

That's just the beginning

Here are a few other powerful things you can do:

Impact Analysis

"What breaks if I change the UserService class?"

See all dependencies and side effects before refactoring.

Decision History

"Why did we choose PostgreSQL over MongoDB?"

Recall past decisions with full context and reasoning.

Semantic Code Search

"Find where we handle rate limiting"

Search by meaning, not just keywords. Find code by intent.

Knowledge Graph

Pro Graph-Lite
Elite Full Graph

"Show dependencies for the auth module"

Pro includes Graph-Lite for module-level links; Elite unlocks full graph layers.

Lessons Learned

New

"NO! You pushed without running tests again!"

→ AI captures the mistake and never repeats it

When you correct your AI, it remembers. Mistakes are captured automatically and surfaced in future sessions to prevent repeating the same errors.

Token-optimized MCP toolset (v0.4.x)

All ContextStream capabilities via ~11 consolidated domain tools — no tool-registry bloat. Index projects, track decisions, analyze dependencies, search semantically — plus GitHub, Slack, and Notion integrations.

session_init
context_smart
search
session
memory
graph
project
workspace
integration
+ more
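To make the consolidation concrete, here is a hypothetical first call a client might issue in a new chat — the `workspace` and `project` argument names are illustrative assumptions, not the exact `session_init` schema:

```json
{
  "method": "tools/call",
  "params": {
    "name": "session_init",
    "arguments": {
      "workspace": "my-team",
      "project": "my-app"
    }
  }
}
```

Because related operations live behind one domain tool rather than dozens of single-purpose tools, the tool registry the AI must load each session stays small.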

"But don't AI coding tools already have memory?"

Built-in memory is limited:

  • βœ—Vendor lock-in β€” switch tools, lose everything
  • βœ—Expires or resets β€” context vanishes over time
  • βœ—No semantic search β€” can't find past conversations
  • βœ—Personal only β€” teammates start from zero
  • βœ—No API access β€” can't automate or integrate
  • βœ—Memory isolated from code β€” decisions aren't linked to codebase
  • βœ—Clunky to use β€” no simple way to save or retrieve context

ContextStream is different:

  • βœ“Universal β€” works with Cursor, Claude, Windsurf, any MCP tool
  • βœ“Persistent forever β€” never lose context (paid plans)
  • βœ“Semantic search β€” find anything across all history
  • βœ“Team memory β€” shared context, instant onboarding
  • βœ“Full API access β€” token-optimized MCP (no tool-registry bloat)
  • βœ“Knowledge graph β€” decisions linked to code, impact analysis
  • βœ“Natural language β€” "remember X", "what did we decide about Y?"

"What about self-hosted memory tools?"

Self-hosted means self-managed:

  • βœ—Docker, servers, maintenance β€” you're the ops team now
  • βœ—You manage uptime β€” backups, scaling, monitoring on you
  • βœ—Just memory storage β€” no code intelligence or impact analysis
  • βœ—Basic key-value β€” no knowledge graph or relationships
  • βœ—Limited API surface β€” fewer tools, less automation
  • βœ—Team features = extra work β€” auth, permissions, sharing

ContextStream handles it all:

  • βœ“2-minute setup β€” no Docker, no servers, no devops
  • βœ“We handle infrastructure β€” uptime, backups, scaling included
  • βœ“Code intelligence β€” understands your architecture and decisions
  • βœ“Knowledge graph β€” linked memory with impact analysis
  • βœ“Consolidated MCP tools β€” full API, deep automation
  • βœ“Teams built-in β€” workspaces, sharing, instant onboarding
Quick Setup

See the magic in your next coding session

Three steps. Copy-paste config. Ask one question. Watch your AI actually remember.

1

Get your API key (30 seconds)

Sign up, go to Settings → API Keys, create one.

2

Add to your MCP config (60 seconds)

Paste this into your Cursor/Claude MCP settings:

{
  "mcpServers": {
    "contextstream": {
      "command": "npx",
      "args": ["-y", "@contextstream/mcp-server"],
      "env": {
        "CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
        "CONTEXTSTREAM_API_KEY": "your-key-here"
      }
    }
  }
}

3

Try it out (30 seconds)

Open Cursor or Claude, start a new chat, and type:

"Initialize session and remember that I prefer TypeScript with strict mode, and we use PostgreSQL for this project."

Then start a brand new conversation and ask:

"What are my preferences for this project?"

It remembers. Across sessions. Across tools. Forever.

The difference is night and day

See how ContextStream transforms your AI workflow

Without ContextStream

You explain the auth system. Close chat. Open new chat. Explain auth again.

With ContextStream

AI recalls your auth decisions from last month — JWT choice, refresh token strategy, everything.

Without ContextStream

Switch from Cursor to Claude. Lose all context. Start over.

With ContextStream

Same memory everywhere. Cursor, Claude, Windsurf — your AI knows you.

Without ContextStream

New team member joins. Days of onboarding conversations.

With ContextStream

Shared workspace memory. New hires get context from day one.

Without ContextStream

"Why did we build it this way?" No one remembers.

With ContextStream

Decisions linked to code. Ask why, get the reasoning and the commit.

Dashboard

Gain Visibility into Your Codebase

Track token savings and estimated AI cost saved, plus indexing status, language distribution, and memory usage. See exactly what your AI knows — and what it's saving you.

ContextStream Dashboard - Workspaces, Projects, Files Indexed, and Credits overview
Real-time
Indexing progress & status
Visualize
Language distribution & activity
Track
Memory events & decisions
Early Access

Join developers who are done repeating themselves

ContextStream is in active development with real users. Early adopters get direct access to the team and influence on the roadmap.

~75%
Fewer tool tokens vs. legacy
∞
No automatic expiry (paid plans)
<2min
Setup time

"I was tired of explaining our auth setup to Claude every session. Now I just say 'init session' and it already knows."

— Early access feedback

Beyond Memory

Memory is just the start

ContextStream also understands your code — dependencies, impact, architecture, lessons learned, and insights captured.

Persistent Memory

Store decisions, preferences, and context. Searchable across sessions and tools.

Semantic Search

Find code by what it does, not just keywords.

Dependency Analysis

Pro: Graph-Lite
Elite/Team: Full Graph

Know what depends on what before you refactor.

Knowledge Graphs

Pro: Graph-Lite
Elite/Team: Full Graph

Connect decisions to code, docs to features.

Lessons Learned

Capture mistakes, never repeat them again.

Auto-Session Init

AI gets context automatically. No copy-pasting.

Token Savings Dashboard

Track estimated tokens saved vs default tools (Glob/Grep/Search/Read), plus calls avoided and trends.

Private & Secure

Encrypted at rest, never used for training.


I built ContextStream because I was tired of my AI forgetting everything.

As a software engineer, I wanted more than code memory — I wanted an AI brain that follows me everywhere. One that remembers my decisions across Cursor, Claude, Windsurf, and every tool I use.

Now it does. And it's been a game-changer for how I work.

Privacy First

Your Code Stays Yours

Your data is encrypted, never used for training, and you control who has access.

Encrypted at Rest

AES-256 encryption for all stored data

No Training on Your Data

We never use your code to train AI

You Control Access

Workspace permissions & API keys

Delete Anytime

Full control to delete your data instantly

Technical Security Specifications

TLS 1.3 for all data in transit
AES-256 encryption at rest
Infrastructure that scales
Strong security practices
Encrypted index + limited file content (full content for Full graph tier plans when enabled)
API key authentication for all requests

Frequently Asked Questions

Everything you need to know about ContextStream

Stop repeating yourself.

Setup takes 2 minutes. Your AI remembers across sessions and tools.

Sign up for Pro

25,000 operations/month · No automatic expiry on Pro plans