⚡️ mcpo

Expose any MCP tool as an OpenAPI-compatible HTTP server—instantly.

mcpo is a dead-simple proxy that takes an MCP server command and makes it accessible via standard RESTful OpenAPI, so your tools "just work" with LLM agents and apps expecting OpenAPI servers.

No custom protocol. No glue code. No hassle.

🤔 Why Use mcpo Instead of Native MCP?

MCP servers usually speak over raw stdio, which is:

  • 🔓 Inherently insecure
  • ❌ Incompatible with most tools
  • 🧩 Missing standard features like docs, auth, error handling, etc.

mcpo solves all of that—without extra effort:

  • ✅ Works instantly with OpenAPI tools, SDKs, and UIs
  • 🛡 Adds security, stability, and scalability using trusted web standards
  • 🧠 Auto-generates interactive docs for every tool, no config needed
  • 🔌 Uses pure HTTP—no sockets, no glue code, no surprises

What feels like "one more step" is really fewer steps with better outcomes.

mcpo makes your AI tools usable, secure, and interoperable—right now, with zero hassle.

🚀 Quick Usage

We recommend using uv for lightning-fast startup and zero config.

uvx mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command

Or, if you’re using Python:

pip install mcpo
mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command

To use an SSE-compatible MCP server, simply specify the server type and endpoint:

mcpo --port 8000 --api-key "top-secret" --server-type "sse" -- http://127.0.0.1:8001/sse

You can also run mcpo via Docker with no installation:

docker run -p 8000:8000 ghcr.io/open-webui/mcpo:main --api-key "top-secret" -- your_mcp_server_command

Example:

uvx mcpo --port 8000 --api-key "top-secret" -- uvx mcp-server-time --local-timezone=America/New_York

That’s it. Your MCP tool is now available at http://localhost:8000 with a generated OpenAPI schema — test it live at http://localhost:8000/docs.
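To sanity-check the proxy, you can call a generated endpoint directly. The route and payload below are a sketch based on the mcp-server-time tool's get_current_time operation; consult /docs for the exact schema your tool exposes, and note that the API key is sent as a Bearer token:

# Example call to the proxied tool (route and fields depend on the tool's schema)
curl -X POST http://localhost:8000/get_current_time \
  -H "Authorization: Bearer top-secret" \
  -H "Content-Type: application/json" \
  -d '{"timezone": "America/New_York"}'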

🤝 To integrate with Open WebUI after launching the server, check our docs.

🔄 Using a Config File

You can serve multiple MCP tools via a single config file that follows the Claude Desktop format:

Start via:

mcpo --config /path/to/config.json

Example config.json:

{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    },
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time", "--local-timezone=America/New_York"]
    },
    "mcp_sse": {
      "url": "http://127.0.0.1:8001/sse"
    } // SSE MCP Server
  }
}

Each tool will be accessible under its own unique route, e.g.:

  • http://localhost:8000/memory
  • http://localhost:8000/time

Each with a dedicated OpenAPI schema and proxy handler. Access the full schema UI at http://localhost:8000/<tool>/docs (e.g. /memory/docs, /time/docs).
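To verify a multi-tool setup, call each tool through its prefixed route. The sketch below assumes the time server from the example config and its get_current_time operation; adjust the route and payload to match your tools, and include the Authorization header only if you started mcpo with --api-key:

# Routes are prefixed with the server name from config.json
curl -X POST http://localhost:8000/time/get_current_time \
  -H "Authorization: Bearer top-secret" \
  -H "Content-Type: application/json" \
  -d '{"timezone": "America/New_York"}'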

🔧 Requirements

  • Python 3.8+
  • uv (optional, but highly recommended for performance + packaging)
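If you don't have uv installed yet, the standalone installer from the uv project is a quick option (see https://docs.astral.sh/uv/ for the currently recommended method on your platform):

# Install uv on Linux/macOS via the official installer script
curl -LsSf https://astral.sh/uv/install.sh | sh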

🛠️ Development & Testing

To contribute or run tests locally:

  1. Set up the environment:

    # Clone the repository
    git clone https://github.com/open-webui/mcpo.git
    cd mcpo
    
    # Install dependencies (including dev dependencies)
    uv sync --dev
    
  2. Run tests:

    uv run pytest
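    If you only want to run part of the suite while iterating, pytest's standard selection flags work through uv as well; the keyword below is just a placeholder:

    # Run only tests whose names match a keyword, with verbose output
    uv run pytest -k "your_keyword" -v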
    

🪪 License

MIT

🤝 Contributing

We welcome and strongly encourage contributions from the community!

Whether you're fixing a bug, adding features, improving documentation, or just sharing ideas—your input is incredibly valuable and helps make mcpo better for everyone.

Getting started is easy:

  • Fork the repo
  • Create a new branch
  • Make your changes
  • Open a pull request

Not sure where to start? Feel free to open an issue or ask a question—we’re happy to help you find a good first task.

✨ Star History

Star History Chart

✨ Let's build the future of interoperable AI tooling together!
