
📚 MCP Docs Search Server
A lightweight MCP server that searches and retrieves relevant documentation content from popular AI libraries like LangChain, LlamaIndex, and OpenAI using a combination of web search and content parsing.
This project allows Language Models to query and fetch up-to-date documentation content dynamically, acting as a bridge between LLMs and external doc sources.
Model Context Protocol (MCP)
The Model Context Protocol is an open standard that enables developers to build secure, two-way connections between their data sources and AI-powered tools. The architecture is straightforward: developers can either expose their data through MCP servers or build AI applications (MCP clients) that connect to these servers.
LLMs in Isolation
LLMs alone are limited — their true potential is unlocked when integrated with tools and services via frameworks like MCP.
- Without tools, LLMs are static and have limited utility.
- With tools, they become interactive, but orchestration can be messy.
- With MCP, LLMs gain a scalable, plug-and-play interface to real-world services, making them far more practical and powerful in production environments.
MCP Ecosystem
The MCP Server acts as the translator/interface between LLMs and services.
MCP standardizes how LLMs interact with external tools and services, promoting interoperability, modularity, and cleaner interfaces.
This structure decentralizes responsibility:
- Tool providers build and maintain their own MCP server implementations.
- LLMs just need to speak the MCP protocol.
Purpose and Vision:
- Standardize communication between LLMs and external tools
- Avoid bespoke integrations
- Encourage a scalable ecosystem of services (like a plugin architecture)
🚀 Features
🔍 Web Search Integration Uses the Serper API to query Google and retrieve the top documentation pages related to a given search query.
🧹 Clean Content Extraction Parses HTML content using BeautifulSoup to extract clean, human-readable text—stripping away unnecessary tags, ads, or navigation content.
🤖 Seamless LLM Tooling Exposes a structured get_docs tool that can be used within LLM agents (e.g., Claude, GPT) to query specific libraries in real time.
🛠️ Tool
get_docs(query: str, library: str)
This is the core tool provided by the MCP server. It accepts:
- query: The search term or phrase.
- library: One of langchain, llama-index, or openai.
Workflow
- 🔍 Searches for relevant documentation pages
- 📄 Fetches and parses clean text content
- 🧠 Sends the result back to the LLM for further reasoning and responses
📦 Setup
- Clone the repository
git clone https://github.com/your-username/mcp-docs-search.git
cd mcp-docs-search
- Create a virtual environment using uv and activate it
uv venv .venv
.\.venv\Scripts\activate
- Install dependencies
uv add "mcp[cli]" httpx
uv pip install beautifulsoup4
- Set your environment variables: create a .env file and add your Serper API key:
SERPER_API_KEY=your_serper_api_key
🧩 Claude Desktop Integration
To integrate this server as a tool within Claude Desktop:
Open Claude Desktop → File > Settings > Developer > Edit Config.
Update your claude_desktop_config.json to include the following:
{
"mcpServers": {
"documentation": {
"command": "uv",
"args": [
"--directory",
"your_directory_where_the_repo_exists",
"run",
"main.py"
]
}
}
}
🔁 Important: Restart Claude Desktop after saving the config to load the new tool.
Once integrated successfully, you'll see your custom MCP tool appear within the Claude UI, and you can use it to query docs in real time.
🪲 Debugging in Real Time
You can also debug the tool with the MCP Inspector (requires Node.js 18+):
npx @modelcontextprotocol/inspector uv run main.py
Then open the port where the inspector connection is set up.
🧰 Supported Libraries / Docs
Currently supported: langchain, llama-index, and openai. More libraries can be added by updating the docs_urls dictionary.
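The mapping likely looks something like this; the exact keys and URLs in the repository may differ, so treat this as an illustration of the shape rather than the actual values:

```python
# Hypothetical shape of the docs_urls mapping: keys match the library names
# the get_docs tool accepts, values are the doc sites to scope searches to.
docs_urls = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "openai": "platform.openai.com/docs",
}

# Adding a new library is a one-line change:
docs_urls["huggingface"] = "huggingface.co/docs"
```

A flat dictionary keeps the tool's validation trivial: a library is supported exactly when its name is a key in `docs_urls`.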
🧠 Future Enhancements
- ✅ Add support for additional libraries like HuggingFace, PyTorch, TensorFlow, etc.
- ⚡ Implement caching to reduce redundant fetches and improve performance.
- 📈 Introduce a scoring/ranking mechanism based on relevance or token quality.
- 🧪 Unit testing and better exception handling for production readiness.