
mcp_starter
How to set up an MCP server and MCP client.
MCP Starter Project
What is MCP?
The Model Context Protocol (MCP) is an open standard that lets AI applications interact with external tools and data sources. This starter project has two main components:
- MCP Server: A Python service that defines and exposes tools/functions that AI models can call
- MCP Client: A TypeScript/JavaScript client that connects to the MCP server and manages interactions between AI models and tools
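For orientation, the server side can be as small as a single tool registered with the MCP Python SDK. The sketch below shows an assumed shape only; the tool name get_docs and its parameters are illustrative and not necessarily what main.py uses:
# Minimal sketch of an MCP server exposing one tool (assumed shape,
# not the project's exact main.py). Requires the "mcp" Python SDK.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("docs")

@mcp.tool()
def get_docs(query: str, library: str) -> str:
    """Search the documentation of a supported library (illustrative tool)."""
    # A real implementation would call a search API (e.g. Serper) and return
    # the text of the pages it finds.
    return f"Results for '{query}' in the {library} docs"

if __name__ == "__main__":
    # MCP clients usually launch the server and talk to it over stdio.
    mcp.run(transport="stdio")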
Project Structure
mcp_starter/
├── mcp-server/          # Python MCP server implementation
│   ├── main.py          # Server with documentation search tool
│   └── pyproject.toml   # Python dependencies
└── mcp-clients/         # TypeScript MCP client implementation
    ├── index.ts         # Express server with HuggingFace integration
    └── package.json     # Node.js dependencies
Getting Started
Prerequisites
- Python 3.11 or higher
- Node.js 18 or higher
- Hugging Face API key
- Serper API key for Google Search functionality
Setting Up the Server
- Create a Python virtual environment and activate it:
cd mcp-server
python -m venv .venv
# On Windows
.venv\Scripts\activate
# On macOS/Linux
source .venv/bin/activate
- Install dependencies:
pip install -e .
- Create a .env file in the mcp-server directory:
SERPER_API_KEY=your_serper_api_key_here
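For reference, here is a sketch of how the server might load this key and query Serper. The helper name search_web and the request shape follow Serper's public API and are assumptions, not the exact code in main.py:
# Sketch (assumed, not the project's exact code): load SERPER_API_KEY from .env
# and query Serper's Google Search endpoint.
import os

import requests
from dotenv import load_dotenv

load_dotenv()  # reads mcp-server/.env
SERPER_API_KEY = os.environ["SERPER_API_KEY"]

def search_web(query: str) -> dict:
    # Hypothetical helper: run a Google search through the Serper API.
    response = requests.post(
        "https://google.serper.dev/search",
        headers={"X-API-KEY": SERPER_API_KEY, "Content-Type": "application/json"},
        json={"q": query},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()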
Setting Up the Client
- Install Node.js dependencies:
cd mcp-clients
npm install
- Create a .env file in the mcp-clients directory:
HUGGINGFACE_API_KEY=your_huggingface_api_key_here
- Build the TypeScript code:
npm run build
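Optionally, you can sanity-check the Hugging Face key before starting the client. The snippet below is a quick Python check using the huggingface_hub package; it is not part of this project (the client itself uses the JavaScript SDK):
# Optional sanity check for the Hugging Face key (not part of the project).
# Requires: pip install huggingface_hub python-dotenv
import os

from dotenv import load_dotenv
from huggingface_hub import whoami

load_dotenv()  # reads mcp-clients/.env
info = whoami(token=os.environ["HUGGINGFACE_API_KEY"])
print("Token belongs to:", info.get("name"))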
Running the Application
- Start the MCP server:
cd mcp-server
python main.py
- In a new terminal, start the client, passing the path to the server script:
cd mcp-clients
node build/index.js ../mcp-server/main.py
Using the API
The client exposes two endpoints:
- Health Check: GET http://localhost:3000/health
- Chat: POST http://localhost:3000/chat
Example chat request:
{
  "query": "Search the langchain docs for RAG",
  "sessionId": "user123"
}
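For a quick test from Python (a sketch; the exact response schema returned by the client is not documented here, so it is printed raw):
# Sketch: exercise the client's endpoints (assumes it is running on port 3000).
import requests

health = requests.get("http://localhost:3000/health", timeout=10)
print(health.status_code, health.text)

payload = {"query": "Search the langchain docs for RAG", "sessionId": "user123"}
chat = requests.post("http://localhost:3000/chat", json=payload, timeout=120)
chat.raise_for_status()
print(chat.json())  # schema not documented here, so print the raw JSON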
Features
- Documentation Search Tool: Search documentation for popular AI libraries:
  - LangChain
  - LlamaIndex
  - OpenAI
- Conversation Management: Maintains chat history per session
- Tool Integration: Seamlessly integrates AI model responses with tool calls
- Error Handling: Robust error handling for API calls and tool execution
How It Works
- The MCP server defines tools that can be called by AI models
- The client connects to the MCP server and retrieves the available tools
- When a user sends a query, the client:
  - Formats the conversation history
  - Sends it to the Hugging Face model
  - Extracts and executes tool calls from the model's response
  - Returns the final response, including tool results
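The repository's client is TypeScript, but the same connect / list-tools / call-tool flow can be sketched with the MCP Python SDK; the tool name and arguments below are hypothetical:
# Sketch of the client-side MCP flow in Python (the project's real client is
# TypeScript): spawn the server over stdio, list its tools, and call one.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    server = StdioServerParameters(command="python", args=["../mcp-server/main.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

            # Hypothetical tool name/arguments; in the real app the Hugging Face
            # model decides which tool to call and with which arguments.
            result = await session.call_tool("get_docs", {"query": "RAG", "library": "langchain"})
            print(result.content)

asyncio.run(main())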
Environment Variables
Server
- SERPER_API_KEY: API key for Google Search functionality
Client
- HUGGINGFACE_API_KEY: API key for accessing Hugging Face models
License
MIT License