Introduction
You've heard Manticore Search is fast. You've heard it handles full-text, vector, and fuzzy search in one engine. But when you sit down to actually use it, you're staring at documentation, guessing at SQL syntax, and hoping your CREATE TABLE doesn't throw an obscure error.
MCP-Manticore changes the game. It's a Model Context Protocol (MCP) server that connects Cursor, Claude Code, Codex CLI, or any MCP-compatible AI assistant directly to your Manticore instance. The AI can read the docs, inspect your schema, and run validation queries—all before it writes a single line of SQL for you.
MCP (Model Context Protocol) is an open standard that lets AI assistants connect to external tools and data sources. Instead of the AI hallucinating Manticore syntax based on training data from who-knows-when, it gets real-time access to your database and the official documentation.
Two Ways This Helps You
Depending on what you're doing, MCP-Manticore provides value in two different ways:
1. Developer Assistance (Primary): You're building an app that uses Manticore. The AI helps you create tables, write complex queries, and understand features—without you memorizing SQL syntax.
2. Data Exploration (Secondary): You have data in Manticore and want to ask questions in plain English. "Show me the cheapest accessories" or "Find products similar to this one." For complex operations like vector search or fuzzy matching, the AI needs MCP to know the correct Manticore-specific syntax.
Let's look at both.
The Problem: AI Without Context
When you ask an AI assistant to help with Manticore without MCP integration, here's what happens:
Example: Creating a Table with Auto-Embeddings
You: "How do I create a Manticore table with auto-embeddings for semantic search?"
AI (guessing):
CREATE TABLE products (
    title text,
    description text,
    description_vector float_vector
        embedding='sentence-transformers/all-MiniLM-L6-v2'
        source='description'
);
Looks reasonable, right? Wrong. The actual syntax uses MODEL_NAME and FROM, not embedding and source. You'd discover this only after the query fails, sending you back to the documentation.
This is the pattern:
- AI generates query based on outdated or guessed syntax
- You run it, get an error
- You manually check documentation
- You correct the AI (or fix it yourself)
- Repeat until it works
For developers integrating Manticore into applications, this friction kills productivity. You're not exploring data—you're fighting syntax.
The Solution: MCP-Manticore
With MCP-Manticore enabled, the conversation changes:
You: "How do I create a Manticore table with auto-embeddings for semantic search?"
AI: "Let me check the documentation for the correct syntax..."
[AI calls list_documentation via MCP, searches for "embedding|semantic|knn|vector"]
AI: "Here's the correct syntax. Manticore uses MODEL_NAME and FROM for auto-embeddings:"
CREATE TABLE products (
    title TEXT,
    description TEXT,
    embedding_vector FLOAT_VECTOR KNN_TYPE='hnsw' HNSW_SIMILARITY='l2'
        MODEL_NAME='sentence-transformers/all-MiniLM-L6-v2' FROM='title,description'
);
The AI now has:
- Real-time access to Manticore documentation
- Schema introspection via list_tables() and describe_table()
- Query execution to test and validate
- Safety controls—read-only by default, write operations require opt-in
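Under the hood, these tools boil down to HTTP calls against Manticore. Here is a minimal Python sketch of what a tool like execute_query might do, assuming Manticore's default /sql endpoint on port 9308 (build_request and execute_query are illustrative names, not the server's actual implementation):

```python
import json
import urllib.parse
import urllib.request

MANTICORE_URL = "http://localhost:9308/sql"  # Manticore's default HTTP port

def build_request(sql: str) -> urllib.request.Request:
    # Manticore's /sql endpoint accepts the statement as a form-encoded 'query' field
    data = ("query=" + urllib.parse.quote(sql)).encode()
    return urllib.request.Request(MANTICORE_URL, data=data, method="POST")

def execute_query(sql: str):
    # Requires a running Manticore instance; returns the decoded JSON response
    with urllib.request.urlopen(build_request(sql)) as resp:
        return json.loads(resp.read())
```

The point is that the MCP server is a thin, inspectable bridge: every tool call maps to an ordinary request you could reproduce with curl.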
Real Examples: With and Without MCP
Example 1: Schema Creation
Without MCP:
The AI guessed the syntax, using embedding='...' and source='...'—which doesn't exist in Manticore. You'd hit an error and waste time debugging.
With MCP:
The AI retrieved the official documentation first and provided the correct MODEL_NAME and FROM syntax. It also explained the supported models (local HuggingFace models, OpenAI, Voyage, Jina) and the HNSW_SIMILARITY options (L2, IP, COSINE).
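To illustrate one of those options, here is what a cosine-similarity variant of the same idea might look like (a sketch; the table and column names are illustrative, not from the example above):

```sql
CREATE TABLE docs (
    title TEXT,
    body TEXT,
    vec FLOAT_VECTOR KNN_TYPE='hnsw' HNSW_SIMILARITY='cosine'
        MODEL_NAME='sentence-transformers/all-MiniLM-L6-v2' FROM='title,body'
);
```

Cosine is a common default for sentence-transformer embeddings, while L2 and IP suit models trained with those metrics.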
Example 2: Semantic Search with Auto-Embeddings
You: "Find products similar to 'noise-canceling headphones for travel'"
Without MCP:
The AI completely loses track. Without access to documentation, it:
- Tries to SELECT all data and aggregate internally without any filter
- Hallucinates embedding vectors with made-up syntax: ANY_KNN(embedding, (-0.07089090,0.04201586,-0.03262700...))
- Attempts to write Python scripts to manually calculate similarity
- Eventually gives up and just does string matching on descriptions
Result: It finds "Wireless Headphones" only because the description literally contains "noise-canceling headphones" — pure luck, not semantic search.
With MCP:
The AI checks documentation, discovers your table uses auto-embeddings, and learns that knn() accepts text directly when MODEL_NAME is configured:
SELECT id, name, description, knn_dist()
FROM products
WHERE knn(embedding, 5, 'noise-canceling headphones for travel');
Result: Returns Wireless Headphones as #1 (correct), but also surfaces semantically related items — actual vector similarity, not keyword matching.
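For tables that store precomputed vectors without a MODEL_NAME, knn() takes an explicit float vector instead of text. A sketch (the four-value vector is a placeholder; a real vector must match the embedding model's dimensionality):

```sql
-- Placeholder vector; in practice it has the model's full dimensionality
SELECT id, name, knn_dist()
FROM products
WHERE knn(embedding, 5, (0.65, -0.12, 0.33, 0.08));
```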
Example 3: Fuzzy Search (Typo Tolerance)
You: "Find products even if I misspell the name, like 'headphons' instead of 'headphones'"
Without MCP:
The AI tries everything it was trained on, hoping something works:
- MATCH('headphons~1') and MATCH('headphons~') — wrong operators
- CALL SUGGEST('headphons', 'products') — wrong approach for this use case
- MATCH('FUZZY(headphons') — hallucinated syntax that doesn't exist
- ALTER TABLE products SET min_infix_len = 3 — unnecessary and wrong
- OPTION expand_keywords = 1 — unrelated feature
It even tried to optimize the table and run suggestions again. Complete chaos.
Result: No working query. Just a pile of failed attempts based on outdated or confused training data.
With MCP:
The AI checks the documentation and finds the correct syntax immediately:
SELECT * FROM products WHERE MATCH('headphons') OPTION fuzzy=1;
Result: Returns "Wireless Headphones" despite the typo. The AI also explains that fuzzy=1 allows Levenshtein distance of 1 (one character difference), and you can adjust tolerance with OPTION fuzzy=1, distance=2 for more flexibility.
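The Manticore manual documents further fuzzy options; a sketch combining them (hedged: exact option availability depends on your Manticore version):

```sql
-- distance=2 tolerates two character edits;
-- layouts corrects typos made on the wrong keyboard layout
SELECT * FROM products
WHERE MATCH('headphons') OPTION fuzzy=1, distance=2, layouts='us,ru';
```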
Key Features
Intelligent Documentation Lookup
MCP-Manticore includes a documentation fetcher that pulls directly from the Manticore Search manual on GitHub. When you ask about features like KNN vector search, fuzzy matching, or full-text operators, the AI retrieves the official documentation before responding.
Schema-Aware Query Building
The server provides tools that let the AI understand your data structure before writing queries:
- list_tables() — See what tables exist
- describe_table() — Understand column names and types
- execute_query() — Run queries and see results
Safe Query Execution
By default, MCP-Manticore runs in read-only mode. Write operations (INSERT, UPDATE, DELETE, DROP) require explicit opt-in via environment variables:
export MANTICORE_ALLOW_WRITE_ACCESS=true # Enable INSERT/UPDATE/DELETE
export MANTICORE_ALLOW_DROP=true # Enable DROP/TRUNCATE
Multiple Transport Options
Connect via:
- stdio (for CLI-based AI assistants like Claude Code)
- HTTP (for web-based integrations)
- SSE (Server-Sent Events for real-time updates)
With optional JWT authentication for secure deployments.
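For example, exposing the server over HTTP with a token might look like this (a sketch using the MANTICORE_MCP_TRANSPORT and MANTICORE_MCP_AUTH_TOKEN variables from the configuration reference in this article; the token value is a placeholder):

```shell
export MANTICORE_MCP_TRANSPORT=http
export MANTICORE_MCP_AUTH_TOKEN="<your-jwt>"   # placeholder; supply your own token
uvx mcp-manticore
```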
Tutorial: Setting Up MCP-Manticore
MCP-Manticore works with any MCP-compatible AI assistant, including Cursor, Claude Code, Codex CLI, Windsurf, and any other tool that supports the Model Context Protocol.
Step 1: Ensure UV is Installed
MCP-Manticore runs best with uv, a fast Python package manager:
curl -LsSf https://astral.sh/uv/install.sh | sh
With uv, you don't need to manually install MCP-Manticore—uvx downloads and runs it automatically.
Step 2: Configure Environment Variables (Optional)
# Required: Manticore connection (defaults shown)
export MANTICORE_HOST=localhost
export MANTICORE_PORT=9308
# Optional: Enable write access (default: read-only)
export MANTICORE_ALLOW_WRITE_ACCESS=true
# Optional: Allow destructive operations (DROP, TRUNCATE)
export MANTICORE_ALLOW_DROP=false
Step 3: Add to Your MCP Client
General Configuration:
- Command: uvx mcp-manticore
- Environment variables (if needed): MANTICORE_HOST, MANTICORE_PORT, etc.
Example configuration (mcp.json):
{
  "mcpServers": {
    "manticore": {
      "command": "uvx",
      "args": ["mcp-manticore"],
      "env": {
        "MANTICORE_HOST": "localhost",
        "MANTICORE_PORT": "9308"
      }
    }
  }
}
For client-specific setup instructions (Cursor, Claude Desktop, Windsurf, etc.), see the MCP-Manticore README.
Step 4: Verify Connection
Test by asking your AI assistant:
"Show me all tables in Manticore"
You should see the AI call the list_tables() tool and display your tables.
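If that call fails, confirm Manticore itself is reachable before debugging the MCP layer. One quick check, assuming the default port and Manticore's /cli endpoint:

```shell
# Should print your table list if Manticore is listening on 9308
curl -s http://localhost:9308/cli -d 'SHOW TABLES'
```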
Configuration Reference
| Environment Variable | Description | Default |
|---|---|---|
| MANTICORE_HOST | Manticore server hostname | localhost |
| MANTICORE_PORT | Manticore HTTP port | 9308 |
| MANTICORE_ALLOW_WRITE_ACCESS | Enable INSERT/UPDATE/DELETE | false |
| MANTICORE_ALLOW_DROP | Enable DROP/TRUNCATE | false |
| MANTICORE_MCP_TRANSPORT | Transport type (stdio/http/sse) | stdio |
| MANTICORE_MCP_AUTH_TOKEN | JWT token for HTTP/SSE | - |
The Future: Agents That Install Themselves
There's a third use case on the horizon: autonomous agents that discover and install MCP servers themselves.
Imagine an AI agent that:
- Finds your GitHub repo mentioning Manticore
- Searches for "Manticore MCP server"
- Finds MCP-Manticore, installs it automatically
- Starts querying your database to complete its task
This isn't science fiction—OpenAI's Codex and similar agentic systems are moving in this direction. When that future arrives, having MCP-Manticore in the MCP registry means your AI tools will just work with Manticore, no manual setup required.
Conclusion
MCP-Manticore transforms AI assistants from passive text generators into active, knowledgeable development partners. Whether you're:
- Building with Manticore — Let the AI handle syntax while you focus on your application
- Learning Manticore — Ask questions in plain English, get accurate answers backed by docs
- Exploring your data — Query without memorizing SQL syntax or table schemas
The old way: guess, error, debug, repeat.
The new way: ask, verify, execute, done.
Ready to try it? With uv installed, just add MCP-Manticore to your MCP client settings and start asking. Your future self—free from syntax rabbit holes—will thank you.