Universal AI Memory with MCP Integration

Transform any AI assistant into a memory-enabled powerhouse. 16 MCP tools for storing conversations, documents, and knowledge with semantic search and cross-session persistence.

Quick Start

# Run directly with npx
npx columnist-db-mcp
# Or install globally
npm install -g columnist-db-mcp

Works with

Claude Desktop · Cline · Any MCP Client
◇ Claude Desktop Configuration
// ~/Library/Application Support/Claude/claude_desktop_config.json
{
"mcpServers": {
"columnist-db": {
"command": "npx",
"args": ["columnist-db-mcp"]
}
}
}
// Restart Claude Desktop
// MCP tools will appear automatically
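
Cline and other MCP clients generally accept the same mcpServers block; only the settings file location differs. The sketch below assumes Cline's cline_mcp_settings.json; check your client's documentation for the exact path.
◇ Cline Configuration (sketch)
// cline_mcp_settings.json (location depends on your Cline install)
{
  "mcpServers": {
    "columnist-db": {
      "command": "npx",
      "args": ["columnist-db-mcp"]
    }
  }
}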

16 MCP Tools for AI Memory

Complete toolkit for managing memories, conversations, and knowledge bases with semantic search; a usage sketch follows the tool list.

create_memory

Store new memories with content, embeddings, tags, and metadata for semantic search

get_memory

Retrieve specific memories by ID with full content and metadata

update_memory

Update existing memories with new content, tags, or embeddings

delete_memory

Remove memories from the database permanently

list_memories

Browse all memories with pagination and filtering options

search_memories

Hybrid search combining full-text and vector similarity

vector_search

Find semantically similar memories using cosine similarity

text_search

Full-text search with relevance scoring and field filtering

list_tags

Get all tags used across memories with usage counts

get_by_tag

Filter memories by specific tags or tag combinations

store_conversation

Save entire conversation threads with automatic embedding generation

get_conversation

Retrieve complete conversation history by ID

create_document

Store documents with chunking and automatic embedding generation

search_documents

Search across document chunks with semantic matching

get_stats

Database statistics including memory counts and storage usage

export_memories

Export memories to JSON format for backup or migration
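
Exact parameters are defined by each tool's input schema; the sketch below shows how an assistant might combine the conversation and document tools, with illustrative parameter names.
◇ Example: Conversations and Documents (sketch)
// Parameter names are illustrative, not the exact tool schemas
store_conversation({
  title: "Project kickoff",
  messages: [
    { role: "user", content: "Let's default to dark mode" },
    { role: "assistant", content: "Noted, dark mode it is" }
  ]
});
search_documents({
  query: "default theme decision",
  limit: 3
});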

How It Works

Columnist-DB MCP Server provides a universal memory layer for any AI assistant that supports the Model Context Protocol. Once configured, AI assistants can automatically store and recall information across sessions.

1. Install and Configure

Add the MCP server to your AI assistant's configuration file. The server runs locally and stores all data in IndexedDB.

2. Automatic Tool Access

Once configured, the AI assistant automatically gets access to all 16 memory management tools without any additional setup.
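
Tool discovery follows the standard MCP flow: the client sends a JSON-RPC tools/list request over stdio and the server returns the tool definitions with their input schemas. A sketch of the exchange:
◇ Under the Hood: Tool Discovery (sketch)
// Client asks the server which tools it offers
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
// Server responds with the 16 tool names and input schemas,
// e.g. create_memory, search_memories, get_by_tag, ...
// Individual tools are then invoked with "method": "tools/call"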

3. Persistent Memory

The AI can store conversations, documents, and knowledge that persists across sessions. Search uses vector embeddings for semantic matching.

◇ Example: Storing a Memory
// AI assistant automatically uses MCP tools
create_memory({
  content: "User prefers dark mode",
  tags: ["preferences", "ui"],
  embeddings: [0.1, 0.2, ...]
});
◇ Example: Searching Memory
// Semantic search across all memories
search_memories({
  query: "user interface preferences",
  limit: 5
});
// Returns: "User prefers dark mode"
◇ Example: Tag-based Filtering
// Get all memories with specific tags
get_by_tag({
  tags: ["preferences"]
});
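For backups and housekeeping, export_memories and get_stats round out the workflow; the calls below are sketches, not exact schemas.
◇ Example: Exporting Memories (sketch)
// Illustrative calls; check each tool's schema for exact parameters
export_memories({});
// Returns memories as JSON for backup or migration
get_stats({});
// Returns memory counts and storage usage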

Universal Compatibility

Works with Claude Desktop, Cline, and any other MCP-compatible AI assistant. No vendor lock-in.

Local-First Storage

All data stays on your machine. No cloud dependencies, no privacy concerns. Full offline capability with IndexedDB.

Semantic Search

Vector embeddings enable semantic matching. Find relevant memories even when exact keywords don't match.
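
For instance, a query embedding can surface the dark-mode memory from the earlier example even though the wording differs; the embedding values and parameters below are illustrative.
◇ Example: Semantic Lookup (sketch)
// Embedding values and parameter names are illustrative
vector_search({
  embedding: [0.12, 0.08, ...],
  limit: 3
});
// Matches "User prefers dark mode" for a query about "night theme"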