MCPHost 🤖

A CLI host application that enables Large Language Models (LLMs) to interact with external tools through the Model Context Protocol (MCP). Currently supports Claude 3.5 Sonnet, Ollama, Google Gemini, and OpenAI-compatible models.

Discuss the Project on Discord

Overview 🌟

MCPHost acts as a host in the MCP client-server architecture, where:

  • Hosts (like MCPHost) are LLM applications that manage connections and interactions
  • Clients maintain 1:1 connections with MCP servers
  • Servers provide context, tools, and capabilities to the LLMs

This architecture allows language models to:

  • Access external tools and data sources 🛠️
  • Maintain consistent context across interactions 🔄
  • Execute commands and retrieve information safely 🔒

Currently supports:

  • Claude 3.5 Sonnet (claude-3-5-sonnet-20240620)
  • Any Ollama-compatible model with function calling support
  • Google Gemini models
  • Any OpenAI-compatible local or online model with function calling support

Features ✨

  • Interactive conversations with supported models
  • Support for multiple concurrent MCP servers
  • Dynamic tool discovery and integration
  • Tool calling capabilities for all supported models
  • Configurable MCP server locations and arguments
  • Consistent command interface across model types
  • Configurable message history window for context management

Requirements 📋

  • Go 1.23 or later
  • For Claude: An Anthropic API key
  • For Ollama: Local Ollama installation with desired models
  • For Google/Gemini: Google API key (see https://aistudio.google.com/app/apikey)
  • One or more MCP-compatible tool servers

Environment Setup 🔧

  1. Anthropic API Key (for Claude):
export ANTHROPIC_API_KEY='your-api-key'
  2. Ollama Setup:
ollama pull mistral
  • Ensure Ollama is running:
ollama serve

You can also configure the Ollama client using standard environment variables, such as OLLAMA_HOST for the Ollama base URL.
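
For example, if your Ollama instance listens on a non-default address (adjust the host and port to your setup):

export OLLAMA_HOST='http://localhost:11434'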

  3. Google API Key (for Gemini):
export GOOGLE_API_KEY='your-api-key'
  4. OpenAI-Compatible API Setup:
  • Obtain your API server base URL, API key, and model name (see the example below)
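
For example, assuming your provider exposes an OpenAI-compatible endpoint, the API key can be supplied via the OPENAI_API_KEY environment variable and the base URL and model name passed as flags (documented under Flags below):

export OPENAI_API_KEY='your-api-key'
mcphost -m openai:<your-model-name> --openai-url <your-base-url>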

Installation 📦

go install github.com/mark3labs/mcphost@latest
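
The go install command places the binary in $GOPATH/bin (typically $HOME/go/bin); assuming that directory is on your PATH, you can verify the installation:

mcphost --help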

Configuration ⚙️

MCPHost will automatically create a configuration file at ~/.mcp.json if it doesn't exist. You can also specify a custom location using the --config flag:

{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": [
        "mcp-server-sqlite",
        "--db-path",
        "/tmp/foo.db"
      ]
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/tmp"
      ]
    }
  }
}

Each MCP server entry requires:

  • command: The command to run (e.g., uvx, npx)
  • args: Array of arguments for the command:
    • For SQLite server: mcp-server-sqlite with database path
    • For filesystem server: @modelcontextprotocol/server-filesystem with directory path
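
For example, to launch MCPHost with a configuration file stored outside the default location (the path below is only illustrative):

mcphost --config /path/to/custom-mcp.json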

Usage 🚀

MCPHost is a CLI tool that allows you to interact with different AI models through a unified interface, with external tools provided by MCP servers.

Available Models

Models can be specified using the --model (-m) flag:

  • Anthropic Claude (default): anthropic:claude-3-5-sonnet-latest
  • OpenAI or OpenAI-compatible: openai:gpt-4
  • Ollama models: ollama:modelname
  • Google: google:gemini-2.0-flash

Examples

# Use Ollama with Qwen model
mcphost -m ollama:qwen2.5:3b

# Use OpenAI's GPT-4
mcphost -m openai:gpt-4

# Use OpenAI-compatible model
mcphost --model openai:<your-model-name> \
--openai-url <your-base-url> \
--openai-api-key <your-api-key>

Flags

  • --anthropic-url string: Base URL for Anthropic API (defaults to api.anthropic.com)
  • --anthropic-api-key string: Anthropic API key (can also be set via ANTHROPIC_API_KEY environment variable)
  • --config string: Config file location (default is $HOME/.mcp.json)
  • --debug: Enable debug logging
  • --message-window int: Number of messages to keep in context (default: 10)
  • -m, --model string: Model to use (format: provider:model) (default "anthropic:claude-3-5-sonnet-latest")
  • --openai-url string: Base URL for OpenAI API (defaults to api.openai.com)
  • --openai-api-key string: OpenAI API key (can also be set via OPENAI_API_KEY environment variable)
  • --google-api-key string: Google API key (can also be set via GOOGLE_API_KEY environment variable)
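
For illustration, several of these flags can be combined in a single invocation (the values below are placeholders):

mcphost -m google:gemini-2.0-flash \
--config /path/to/custom-mcp.json \
--message-window 20 \
--debug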

Interactive Commands

While chatting, you can use:

  • /help: Show available commands
  • /tools: List all available tools
  • /servers: List configured MCP servers
  • /history: Display conversation history
  • /quit: Exit the application
  • Ctrl+C: Exit at any time

Global Flags

  • --config: Specify custom config file location
  • --message-window: Set number of messages to keep in context (default: 10)

MCP Server Compatibility 🔌

MCPHost can work with any MCP-compliant server. For examples and reference implementations, see the MCP Servers Repository.

Contributing 🤝

Contributions are welcome! Feel free to:

  • Submit bug reports or feature requests through issues
  • Create pull requests for improvements
  • Share your custom MCP servers
  • Improve documentation

Please ensure your contributions follow good coding practices and include appropriate tests.

License 📄

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments 🙏

  • Thanks to the Anthropic team for Claude and the MCP specification
  • Thanks to the Ollama team for their local LLM runtime
  • Thanks to all contributors who have helped improve this tool
