LangChain Agent with MCP Servers
A LangChain agent that uses the LangChain MCP Adapters to integrate tools exposed by Model Context Protocol (MCP) servers.
Overview
This project demonstrates how to build a LangChain agent that uses the Model Context Protocol (MCP) to interact with various services:
- Tavily Search: Web search and news search capabilities
- Weather: Mock weather information retrieval
- Math: Mathematical expression evaluation
The agent uses LangGraph's ReAct agent pattern to dynamically select and use these tools based on user queries.
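The wiring behind this is small. The sketch below is illustrative only: the server file paths and the model string are assumptions, and it targets a recent langchain-mcp-adapters API where get_tools() is called on the client. It shows MCP servers being registered with MultiServerMCPClient and their tools handed to a LangGraph ReAct agent.

# Illustrative wiring only; the project's actual src/agent.py may differ.
import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

async def main():
    # Server paths below are assumptions for illustration.
    client = MultiServerMCPClient({
        "tavily":  {"command": "python", "args": ["src/mcpserver/tavily_server.py"],  "transport": "stdio"},
        "weather": {"command": "python", "args": ["src/mcpserver/weather_server.py"], "transport": "stdio"},
        "math":    {"command": "python", "args": ["src/mcpserver/math_server.py"],    "transport": "stdio"},
    })
    tools = await client.get_tools()                    # discover the tools each server exposes
    agent = create_react_agent("openai:gpt-4o", tools)  # ReAct agent picks a tool per query
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "What is the weather in Paris?"}]}
    )
    print(result["messages"][-1].content)

asyncio.run(main())

At each step of the ReAct loop the model decides whether to call one of the MCP tools (search, weather, math) or answer directly.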
Features
- Graceful Shutdown: All MCP servers implement proper signal handling for clean termination
- Subprocess Management: The main agent tracks and manages all MCP server subprocesses
- Error Handling: Robust error handling throughout the application
- Modular Design: Easy to extend with additional MCP servers
Graceful Shutdown Mechanism
This project implements a comprehensive graceful shutdown system (sketches of the server-side and agent-side handling follow this list):
- Signal Handling: Captures SIGINT and SIGTERM signals to initiate graceful shutdown
- Process Tracking: The main agent maintains a registry of all child processes
- Cleanup Process: Ensures all subprocesses are properly terminated on exit
- Shutdown Flags: Each MCP server has a shutdown flag to prevent new operations when shutdown is initiated
- Async Cooperation: Uses asyncio to allow operations in progress to complete when possible
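On the server side, the pattern can be as small as the sketch below, a minimal illustration of the flag-and-signal idea rather than the project's exact code:

# Minimal illustration of a server-side shutdown flag; not the project's exact code.
import asyncio
import signal

shutdown_requested = False  # flag checked before starting new operations

def _request_shutdown(signum, frame):
    global shutdown_requested
    shutdown_requested = True

signal.signal(signal.SIGINT, _request_shutdown)   # Ctrl+C
signal.signal(signal.SIGTERM, _request_shutdown)  # terminate request from the parent agent

async def serve():
    while not shutdown_requested:
        # ... handle one MCP request here ...
        await asyncio.sleep(0.1)  # yield so in-flight work can finish cooperatively
    print("shutdown flag set, exiting cleanly")

asyncio.run(serve())

On the agent side, the process registry and cleanup can look roughly like this (again an assumption-laden sketch; the function and variable names are invented):

# Illustrative parent-side tracking and cleanup; names are hypothetical.
import atexit
import signal
import subprocess
import sys

child_processes = []  # registry of spawned MCP server subprocesses

def spawn_server(script):
    proc = subprocess.Popen([sys.executable, script])
    child_processes.append(proc)  # track every child so it can be cleaned up later
    return proc

def cleanup():
    for proc in child_processes:
        if proc.poll() is None:    # still running
            proc.terminate()       # SIGTERM lets the server shut down gracefully
            try:
                proc.wait(timeout=5)
            except subprocess.TimeoutExpired:
                proc.kill()        # force-kill only if graceful shutdown stalls

def _handle_signal(signum, frame):
    cleanup()
    sys.exit(0)

atexit.register(cleanup)
signal.signal(signal.SIGINT, _handle_signal)
signal.signal(signal.SIGTERM, _handle_signal)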
Installation
# Clone the repository
git clone https://github.com/yourusername/langchain-mcp.git
cd langchain-mcp
# Create a virtual environment
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install dependencies
pip install -e .
Configuration
Create a .env file in the project root with the following variables:
OPENAI_API_KEY=your_openai_api_key
TAVILY_API_KEY=your_tavily_api_key
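The agent is expected to read these keys at startup. A typical way to do that (assuming the python-dotenv package, which this project may or may not use) looks like:

import os
from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # reads .env from the project root into the process environment
openai_key = os.environ["OPENAI_API_KEY"]
tavily_key = os.environ["TAVILY_API_KEY"]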
Usage
Run the agent from the command line:
python src/agent.py
The agent will prompt for your query and then process it using the appropriate tools.
Development
To add a new MCP server (a minimal skeleton follows these steps):
- Create a new file in src/mcpserver/
- Implement the server with proper signal handling
- Update src/mcpserver/__init__.py to expose the new server
- Add the server configuration to src/agent.py
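As a starting point, a new server might look like the sketch below. It is illustrative only: the file name and the echo tool are invented, and it uses the FastMCP helper from the official mcp Python SDK together with the same signal-flag pattern described above.

# src/mcpserver/echo_server.py -- hypothetical example of a new MCP server
import signal
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("echo")        # server name advertised to the agent
shutdown_requested = False   # shutdown flag, mirroring the other servers

def _request_shutdown(signum, frame):
    global shutdown_requested
    shutdown_requested = True

signal.signal(signal.SIGINT, _request_shutdown)
signal.signal(signal.SIGTERM, _request_shutdown)

@mcp.tool()
def echo(text: str) -> str:
    """Return the input text unchanged."""
    if shutdown_requested:
        raise RuntimeError("server is shutting down")
    return text

if __name__ == "__main__":
    mcp.run(transport="stdio")  # speak MCP over stdin/stdout to the agent

Once the file exists, expose it from src/mcpserver/__init__.py and add a matching stdio entry to the server configuration in src/agent.py so the agent spawns and tracks it like the built-in servers.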
License
MIT