
A LangGraph-powered ReAct agent with Model Context Protocol (MCP) integration: a streamlined web interface for dynamically configuring, deploying, and interacting with AI agents that can access various data sources and APIs through MCP tools.


LangGraph Agents + MCP

English | Korean


project demo

Project Overview

project architecture

LangChain-MCP-Adapters is a toolkit provided by LangChain AI that enables AI agents to interact with external tools and data sources through the Model Context Protocol (MCP). This project provides a user-friendly interface for deploying ReAct agents that can access various data sources and APIs through MCP tools.
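
To make the moving parts concrete, here is a minimal sketch (not this project's actual code) of how MCP tools from langchain-mcp-adapters can be plugged into a LangGraph ReAct agent. The server script path is hypothetical, and the MultiServerMCPClient usage shown matches older adapter releases (newer ones drop the context manager and use `await client.get_tools()`), so treat it as an outline rather than a drop-in snippet.

```python
# Minimal sketch, not the project's actual code: wire MCP tools into a
# LangGraph ReAct agent. The server script path is hypothetical.
import asyncio

from langchain_anthropic import ChatAnthropic
from langchain_core.messages import HumanMessage
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    # Older langchain-mcp-adapters releases expose the client as an async
    # context manager; newer ones are instantiated directly.
    async with MultiServerMCPClient(
        {
            "math": {
                "command": "python",
                "args": ["./math_server.py"],  # hypothetical stdio MCP server
                "transport": "stdio",
            }
        }
    ) as client:
        # The agent's tool set is whatever the connected MCP servers expose.
        agent = create_react_agent(
            ChatAnthropic(model="claude-3-7-sonnet-latest"),
            client.get_tools(),
        )
        result = await agent.ainvoke(
            {"messages": [HumanMessage(content="What is (3 + 5) * 12?")]}
        )
        print(result["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```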

Features

  • Streamlit Interface: A user-friendly web interface for interacting with LangGraph ReAct Agent with MCP tools
  • Tool Management: Add, remove, and configure MCP tools through the UI (Smithery JSON format supported), dynamically and without restarting the application
  • Streaming Responses: View agent responses and tool calls in real time (see the streaming sketch after this list)
  • Conversation History: Track and manage conversations with the agent
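
The streaming behavior in the UI corresponds to LangGraph's graph streaming API. A minimal sketch follows, assuming an agent graph built as in the sketch above; names are illustrative.

```python
# Minimal streaming sketch: assumes `agent` is a compiled LangGraph ReAct agent,
# e.g. built as in the sketch above. Prints each newly produced message
# (AI responses and tool calls/results) as the graph runs.
from langchain_core.messages import HumanMessage


async def stream_answer(agent, question: str) -> None:
    async for state in agent.astream(
        {"messages": [HumanMessage(content=question)]},
        stream_mode="values",  # yield the accumulated state after every step
    ):
        state["messages"][-1].pretty_print()
```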

MCP Architecture

The Model Context Protocol (MCP) consists of three main components:

  1. MCP Host: Programs seeking to access data through MCP, such as Claude Desktop, IDEs, or LangChain/LangGraph.

  2. MCP Client: A protocol client that maintains a 1:1 connection with the server, acting as an intermediary between the host and server.

  3. MCP Server: A lightweight program that exposes specific functionalities through a standardized model context protocol, serving as the primary data source.
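
To make the server role concrete, the sketch below is an illustrative stdio MCP server written with FastMCP from the MCP Python SDK. It is not part of this project, and the tool it exposes is made up.

```python
# Illustrative MCP server (not part of this project): exposes one tool over
# stdio using FastMCP from the MCP Python SDK.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")  # the server name advertised to MCP clients


@mcp.tool()
def get_forecast(city: str) -> str:
    """Return a (fake) forecast for the given city."""
    return f"It is always sunny in {city}."


if __name__ == "__main__":
    # Speak MCP over stdio so a client can launch this script as a subprocess.
    mcp.run(transport="stdio")
```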

Quick Start with Docker

You can easily run this project using Docker without setting up a local Python environment.

Requirements (Docker Desktop)

Install Docker Desktop for your operating system from the official Docker website.

Run with Docker Compose

  1. Navigate to the dockers directory.
cd dockers
  2. Create a .env file with your API keys in the project root directory.
cp .env.example .env

Enter your obtained API keys in the .env file.

(Note) Not all API keys are required. Only enter the ones you need.

  • ANTHROPIC_API_KEY: If you enter an Anthropic API key, you can use "claude-3-7-sonnet-latest", "claude-3-5-sonnet-latest", "claude-3-haiku-latest" models.
  • OPENAI_API_KEY: If you enter an OpenAI API key, you can use "gpt-4o", "gpt-4o-mini" models.
  • LANGSMITH_API_KEY: If you enter a LangSmith API key, you can use LangSmith tracing.
ANTHROPIC_API_KEY=your_anthropic_api_key
OPENAI_API_KEY=your_openai_api_key
LANGSMITH_API_KEY=your_langsmith_api_key
LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
LANGSMITH_PROJECT=LangGraph-MCP-Agents

When using the login feature, set USE_LOGIN to true and enter USER_ID and USER_PASSWORD.

USE_LOGIN=true
USER_ID=admin
USER_PASSWORD=admin123

If you don't want to use the login feature, set USE_LOGIN to false.

USE_LOGIN=false
  3. Select the Docker Compose file that matches your system architecture.

AMD64/x86_64 Architecture (Intel/AMD Processors)

# Run container
docker compose -f docker-compose.yaml up -d

ARM64 Architecture (Apple Silicon M1/M2/M3/M4)

# Run container
docker compose -f docker-compose-mac.yaml up -d
  4. Access the application in your browser at http://localhost:8585

(Note)

  • If you need to modify ports or other settings, edit the docker-compose.yaml file before building.
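
For instance, the host-side port is the most common thing to change. The snippet below is illustrative only and does not reproduce the project's actual compose file; the service name and port values are assumptions based on the default http://localhost:8585 address.

```yaml
# Illustrative only -- not the project's actual docker-compose.yaml.
services:
  app:
    ports:
      - "8585:8585"  # host:container; change the left-hand value to use a different host port
```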

Install Directly from Source Code

  1. Clone this repository
git clone https://github.com/teddynote-lab/langgraph-mcp-agents.git
cd langgraph-mcp-agents
  2. Create a virtual environment and install dependencies using uv
uv venv
uv pip install -r requirements.txt
source .venv/bin/activate  # For Windows: .venv\Scripts\activate
  3. Create a .env file with your API keys (copy from .env.example)
cp .env.example .env

Enter your obtained API keys in the .env file.

(Note) Not all API keys are required. Only enter the ones you need.

  • ANTHROPIC_API_KEY: If you enter an Anthropic API key, you can use "claude-3-7-sonnet-latest", "claude-3-5-sonnet-latest", "claude-3-haiku-latest" models.
  • OPENAI_API_KEY: If you enter an OpenAI API key, you can use "gpt-4o", "gpt-4o-mini" models.
  • LANGSMITH_API_KEY: If you enter a LangSmith API key, you can use LangSmith tracing.
ANTHROPIC_API_KEY=your_anthropic_api_key
OPENAI_API_KEY=your_openai_api_key
LANGSMITH_API_KEY=your_langsmith_api_key
LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
LANGSMITH_PROJECT=LangGraph-MCP-Agents
  4. (New) Use the login/logout feature

When using the login feature, set USE_LOGIN to true and enter USER_ID and USER_PASSWORD.

USE_LOGIN=true
USER_ID=admin
USER_PASSWORD=admin123

If you don't want to use the login feature, set USE_LOGIN to false.

USE_LOGIN=false

Usage

  1. Start the Streamlit application.
streamlit run app.py
  2. The application will open in your browser and display the main interface.

  3. Use the sidebar to add and configure MCP tools

Visit Smithery to find useful MCP servers.

First, select the tool you want to use.

Click the COPY button in the JSON configuration on the right.

copy from Smithery

Paste the copied JSON string in the Tool JSON section.

tool json
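
The pasted configuration uses the standard MCP server JSON shape that Smithery generates. The entry below is purely illustrative (the server name, command, and arguments are placeholders); always copy the real JSON from Smithery for the tool you selected.

```json
{
  "mcpServers": {
    "example-server": {
      "command": "npx",
      "args": ["-y", "@example/example-mcp-server"]
    }
  }
}
```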

Click the Add Tool button to add it to the "Registered Tools List" section.

Finally, click the "Apply" button to apply the changes and initialize the agent with the new tools.

tool json
  4. Check the agent's status.

check status

  5. Interact with the ReAct agent that utilizes the configured MCP tools by asking questions in the chat interface.

project demo

Hands-on Tutorial

For developers who want a deeper understanding of how the MCP and LangGraph integration works, we provide a comprehensive Jupyter notebook tutorial in the repository.

This hands-on tutorial covers:

  1. MCP Client Setup - Learn how to configure and initialize the MultiServerMCPClient to connect to MCP servers
  2. Local MCP Server Integration - Connect to locally running MCP servers via SSE and Stdio methods
  3. RAG Integration - Access retriever tools using MCP for document retrieval capabilities
  4. Mixed Transport Methods - Combine different transport protocols (SSE and Stdio) in a single agent
  5. LangChain Tools + MCP - Integrate native LangChain tools alongside MCP tools

This tutorial provides practical examples with step-by-step explanations that help you understand how to build and integrate MCP tools into LangGraph agents.
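
As a taste of what the notebook covers, the sketch below mixes a stdio server and an SSE server in one MultiServerMCPClient configuration and combines their tools with a native LangChain tool. Server names, commands, and the URL are placeholders, and the client API (context manager vs. direct instantiation) depends on your langchain-mcp-adapters version.

```python
# Sketch only: placeholders throughout; adjust servers and adapter version to your setup.
from langchain_core.tools import tool
from langchain_mcp_adapters.client import MultiServerMCPClient


@tool
def shout(text: str) -> str:
    """Native LangChain tool: upper-case the input text."""
    return text.upper()


server_config = {
    "math": {  # launched as a subprocess, speaks stdio
        "command": "python",
        "args": ["./math_server.py"],
        "transport": "stdio",
    },
    "retriever": {  # already running elsewhere, reached over SSE
        "url": "http://localhost:8000/sse",
        "transport": "sse",
    },
}


async def build_and_use_tools() -> None:
    # Older adapter releases use the async context manager shown here; newer ones
    # instantiate the client directly and call `await client.get_tools()`.
    async with MultiServerMCPClient(server_config) as client:
        # Combine MCP tools with native tools while the sessions are open
        # (stdio sessions close when the context manager exits).
        tools = client.get_tools() + [shout]
        # ...pass `tools` to create_react_agent here, as in the earlier sketch.
        print([t.name for t in tools])
```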

License

MIT License
