
context7-mcp
Context7 MCP Server
3 years · Works with Finder
5 GitHub Watchers · 9 GitHub Forks · 161 GitHub Stars
Context7 MCP - Up-to-date Docs For Any Cursor Prompt
❌ Without Context7
LLMs rely on outdated or generic information about the libraries you use. You get:
- ❌ Code examples that are outdated, based on year-old training data
- ❌ Hallucinated APIs that don't even exist
- ❌ Generic answers for old package versions
✅ With Context7
Context7 MCP pulls up-to-date, version-specific documentation and code examples straight from the source — and places them directly into your prompt.
Add `use context7` to your question in Cursor:
How do I use the new Next.js `after` function? use context7
How do I invalidate a query in React Query? use context7
How do I protect a route with NextAuth? use context7
Context7 fetches up-to-date documentation and working code examples right into your LLM’s context.
- 1️⃣ Ask your question naturally
- 2️⃣ Tell the LLM to `use context7`
- 3️⃣ Get working code answers
No tab-switching, no hallucinated APIs that don't exist, no outdated code generations.
🛠️ Getting Started
Requirements
- Node.js >= v18.0.0
- Cursor, Windsurf, Claude Desktop or another MCP Client
Install in Cursor
Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server
Paste this into your Cursor `~/.cursor/mcp.json` file. See Cursor MCP docs for more info.
```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```
Install in Windsurf
Add this to your Windsurf MCP config file. See Windsurf MCP docs for more info.
```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```
Install in VSCode
Add this to your VSCode MCP config file. See VSCode MCP docs for more info.
```json
{
  "servers": {
    "Context7": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```
Available Tools
- `resolve-library-id`: Resolves a general library name into a Context7-compatible library ID.
  - `libraryName` (optional): Search and rerank results
- `get-library-docs`: Fetches documentation for a library using a Context7-compatible library ID.
  - `context7CompatibleLibraryID` (required)
  - `topic` (optional): Focus the docs on a specific topic (e.g., "routing", "hooks")
  - `tokens` (optional, default 5000): Max number of tokens to return
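As a rough sketch of how an MCP client drives these two tools in sequence (resolve an ID first, then fetch docs), here is an example using the `@modelcontextprotocol/sdk` TypeScript client. The client name, the example library ID `/vercel/next.js`, and the exact SDK call shapes are illustrative assumptions, not part of this repository.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the Context7 MCP server over stdio, exactly as the configs above do.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@upstash/context7-mcp"],
});

const client = new Client({ name: "context7-demo", version: "0.0.1" }, { capabilities: {} });
await client.connect(transport);

// 1. Resolve a human-readable library name to a Context7-compatible ID.
const resolved = await client.callTool({
  name: "resolve-library-id",
  arguments: { libraryName: "next.js" },
});
console.log(resolved); // inspect the returned library ID

// 2. Fetch topic-focused docs for that ID ("/vercel/next.js" is a hypothetical example).
const docs = await client.callTool({
  name: "get-library-docs",
  arguments: {
    context7CompatibleLibraryID: "/vercel/next.js",
    topic: "routing",
    tokens: 5000,
  },
});
console.log(docs);
```

In practice your MCP client (Cursor, Windsurf, Claude Desktop) performs these calls for you whenever you add `use context7` to a prompt; the sketch only shows the tool names and parameters from the list above wired together.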
Development
Clone the project and install dependencies:
```bash
bun i
```
Build:
```bash
bun run build
```
Local Configuration Example
```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["tsx", "/path/to/folder/context7-mcp/src/index.ts"]
    }
  }
}
```
Testing with MCP Inspector
```bash
npx -y @modelcontextprotocol/inspector npx @upstash/context7-mcp
```
License
MIT
Reviews

user_NwHtLUSn
As a dedicated user of context7-mcp by upstash, I must say this tool is a game-changer! The integration is seamless and it has significantly streamlined my workflow. The detailed documentation and user-friendly interface made it easy to get started right away. Highly recommend for anyone looking to enhance their development process! Check it out on their GitHub.