
mcp-engine
MCPEngine is a client, server, and proxy implementation of the Model Context Protocol (MCP), oriented towards enterprise and real-world remote MCP applications.
MCPEngine
Production-Grade Implementation of the Model Context Protocol (MCP)

Overview
MCPEngine is a production-grade, HTTP-first implementation of the Model Context Protocol (MCP). It provides a secure, scalable, and modern framework for exposing data, tools, and prompts to Large Language Models (LLMs) via MCP.
We believe MCP can be the "REST for LLMs," enabling any application (Slack, Gmail, GitHub, etc.) to expose a standardized endpoint that LLMs can access without custom-coded integrations. MCPEngine is our contribution to making MCP robust enough for modern, cloud-native use cases.
Key Features
- Built-in OAuth with Okta, Keycloak, Google SSO, etc.
- HTTP-first design (SSE instead of just stdio)
- Scope-based Authorization for tools, resources, and prompts
- Seamless bridging for LLM hosts (like Claude Desktop) via a local proxy
- Full backwards-compatibility with FastMCP and the official MCP SDK
Architecture
MCPEngine uses a proxy-based architecture to integrate with LLM hosts like Claude Desktop:
┌───────────────┐   stdio    ┌─────────────────┐  HTTP/SSE   ┌───────────────┐
│               ├───────────►│    MCPProxy     ├────────────►│   MCPEngine   │
│  Claude Host  │            │ (runs locally)  │             │    Server     │
│               │◄───────────┤                 │◄───────────┬┤   (remote)    │
└───────────────┘            └─────────────────┘  OAuth 2.1 │└───────────────┘
                                                            │
                                                ┌───────────┴────────────┐
                                                │   Identity Provider    │
                                                │ (Okta, Keycloak, etc.) │
                                                └────────────────────────┘
This architecture provides several advantages:
- Seamless integration - Claude sees a local stdio-based process
- Security - The proxy handles OAuth authentication flows
- Scalability - The MCPEngine server can run anywhere (cloud, on-prem)
- Separation of concerns - Authentication is handled independently from your business logic
Installation
uv add "mcpengine[cli]"
# or
pip install "mcpengine[cli]"
Once installed, you can run the CLI tools:
mcpengine --help
Quickstart
Create a Server
# server.py
from mcpengine import MCPEngine

mcp = MCPEngine("Demo")

@mcp.tool()
def add(a: int, b: int) -> int:
    return a + b

@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    return f"Hello, {name}!"
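Starting the server is not shown in the snippet above. Assuming MCPEngine mirrors FastMCP's run() interface (the project advertises full FastMCP compatibility), a minimal, unverified way to serve it over HTTP/SSE on the default port 8000 might look like this, appended to server.py:
# Hypothetical entry point: assumes a FastMCP-compatible run() that
# serves HTTP/SSE on http://localhost:8000 when transport="sse".
if __name__ == "__main__":
    mcp.run(transport="sse")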
Claude Desktop Integration
If your server is at http://localhost:8000, you can start the proxy locally:
mcpengine proxy http://localhost:8000/sse
Claude Desktop sees a local stdio server, while the proxy handles any necessary OAuth or SSE traffic automatically.
Core Concepts
Authentication & Authorization
Enable OAuth and scopes:
from mcpengine import MCPEngine, Context

mcp = MCPEngine(
    "SecureDemo",
    authentication_enabled=True,
    issuer_url="https://your-idp.example.com/realms/some-realm",
)

@mcp.auth(scopes=["calc:read"])
@mcp.tool()
def add(a: int, b: int, ctx: Context) -> int:
    ctx.info(f"User {ctx.user_id} with roles {ctx.roles} called add.")
    return a + b
Any attempt to call add requires the calc:read scope. Without it, the server returns 401 Unauthorized, which triggers a login flow when the request goes through the proxy.
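Per the feature list, scope-based authorization also applies to resources and prompts. The sketch below assumes @mcp.auth stacks on @mcp.resource the same way it stacks on @mcp.tool (continuing the SecureDemo server above); the URI and scope name are illustrative:
@mcp.auth(scopes=["config:read"])
@mcp.resource("config://billing")
def billing_config() -> str:
    # Hypothetical protected resource: callers without the config:read
    # scope receive 401 Unauthorized, just like the protected tool above.
    return "plan=enterprise"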
Resources
@mcp.resource("uri"): Provide read-only context for LLMs, like a GET endpoint.

from mcpengine import MCPEngine

mcp = MCPEngine("Demo")

@mcp.resource("config://app")
def get_config() -> str:
    return "Configuration Data"
Tools
@mcp.tool(): LLM-invokable functions. They can have side effects or perform computations.

from mcpengine import MCPEngine

mcp = MCPEngine("Demo")

@mcp.tool()
def send_email(to: str, body: str):
    return "Email Sent!"
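Given the advertised FastMCP compatibility, type hints and docstrings are presumably used to describe tools to the LLM; the tool below (continuing the Demo server above) is illustrative and leans on that assumption:
@mcp.tool()
def schedule_meeting(title: str, minutes: int = 30) -> str:
    """Schedule a meeting and return a confirmation string."""
    # Hypothetical tool: the docstring and type hints are assumed to be
    # surfaced as the tool's description and input schema.
    return f"Scheduled '{title}' for {minutes} minutes."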
Prompts
@mcp.prompt(): Reusable conversation templates.

from mcpengine import MCPEngine

mcp = MCPEngine("Demo")

@mcp.prompt()
def debug_prompt(error_msg: str):
    return f"Debug: {error_msg}"
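Prompts can take several parameters and return a longer template. The example below is purely illustrative and uses the same string-returning form shown above:
@mcp.prompt()
def code_review_prompt(language: str, snippet: str) -> str:
    # Illustrative template; the parameter names are not part of any fixed API.
    return (
        f"Review the following {language} code for bugs and style issues, "
        f"and suggest concrete fixes:\n\n{snippet}"
    )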
Images
Return images as first-class data:

from mcpengine import MCPEngine, Image

mcp = MCPEngine("Demo")

@mcp.tool()
def thumbnail(path: str) -> Image:
    # ... function body omitted
    pass
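One possible body for a thumbnail tool, assuming MCPEngine's Image helper accepts raw bytes plus a format string the way FastMCP's does (that constructor is an assumption, not confirmed MCPEngine API), using Pillow and continuing the example above:
import io

from PIL import Image as PILImage  # third-party dependency: Pillow

@mcp.tool()
def make_thumbnail(path: str) -> Image:
    # Hypothetical implementation: resize in memory and return the bytes.
    img = PILImage.open(path)
    img.thumbnail((100, 100))
    buf = io.BytesIO()
    img.save(buf, format="PNG")
    return Image(data=buf.getvalue(), format="png")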
Context
Each request has a Context:
- ctx.user_id: Authenticated user ID
- ctx.user_name: Authenticated user name
- ctx.roles: User scopes/roles
- ctx.info(...): Logging
- ctx.read_resource(...): Access other resources
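A small sketch that ties these attributes together; it relies only on the fields listed above, and the whoami tool itself is illustrative:
from mcpengine import MCPEngine, Context

mcp = MCPEngine("Demo")

@mcp.tool()
def whoami(ctx: Context) -> str:
    # Log the call, then report the caller's identity and scopes.
    ctx.info(f"whoami called by {ctx.user_id}")
    return f"{ctx.user_name} ({ctx.user_id}), roles: {ctx.roles}"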
Example Implementations
SQLite Explorer
import sqlite3

from mcpengine import MCPEngine, Context

mcp = MCPEngine(
    "SQLiteExplorer",
    authentication_enabled=True,
    issuer_url="https://your-idp.example.com/realms/some-realm",
)

@mcp.auth(scopes=["database:read"])
@mcp.tool()
def query_db(sql: str, ctx: Context) -> str:
    conn = sqlite3.connect("data.db")
    try:
        rows = conn.execute(sql).fetchall()
        ctx.info(f"User {ctx.user_id} executed query: {sql}")
        return str(rows)
    except Exception as e:
        return f"Error: {str(e)}"
    finally:
        # Close the connection whether the query succeeds or fails.
        conn.close()
Echo Server
from mcpengine import MCPEngine

mcp = MCPEngine("Demo")

@mcp.resource("echo://{msg}")
def echo_resource(msg: str):
    return f"Resource echo: {msg}"

@mcp.tool()
def echo_tool(msg: str):
    return f"Tool echo: {msg}"
Smack - Message Storage Example

Smack is a simple messaging service example with PostgreSQL storage that demonstrates MCPEngine's capabilities with OAuth 2.1 authentication.
Quick Start
- Start the service using Docker Compose:
git clone https://github.com/featureform/mcp-engine.git
cd mcp-engine/examples/servers/smack
docker-compose up --build
- Configure Claude Desktop to use Smack, either manually or via the CLI.
Manually, create the config file:
touch ~/Library/Application\ Support/Claude/claude_desktop_config.json
Add to the file:
{
  "mcpServers": {
    "smack_mcp_server": {
      "command": "bash",
      "args": [
        "-c",
        "docker attach mcpengine_proxy || docker run --rm -i --net=host --name mcpengine_proxy featureformcom/mcpengine-proxy -host=http://localhost:8000 -debug -client_id=optional -client_secret=optional"
      ]
    }
  }
}
Via CLI:
mcpengine proxy http://localhost:8000
Smack provides two main tools, sketched below:
- list_messages(): Retrieves all messages
- post_message(message: str): Posts a new message
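As a rough illustration of what those two tools could look like server-side (the table schema, connection string, psycopg usage, and the absence of auth decorators are all assumptions; the real Smack code in the repository is the reference):
import psycopg  # third-party PostgreSQL driver (psycopg 3)

from mcpengine import MCPEngine

mcp = MCPEngine("Smack")
DSN = "postgresql://postgres:postgres@localhost:5432/smack"  # hypothetical DSN

@mcp.tool()
def list_messages() -> str:
    # Hypothetical schema: a "messages" table with id and message columns.
    with psycopg.connect(DSN) as conn:
        rows = conn.execute("SELECT message FROM messages ORDER BY id").fetchall()
    return "\n".join(row[0] for row in rows)

@mcp.tool()
def post_message(message: str) -> str:
    with psycopg.connect(DSN) as conn:
        conn.execute("INSERT INTO messages (message) VALUES (%s)", (message,))
    return "Message posted."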
For more details, see the Smack example code.
Roadmap
- Advanced Auth Flows
- Service Discovery
- Fine-Grained Authorization
- Observability & Telemetry
- Ongoing FastMCP Compatibility
Contributing
We welcome feedback, issues, and pull requests. If you'd like to shape MCP's future, open an issue or propose changes on GitHub. We actively maintain MCPEngine to align with real-world enterprise needs.
Community
Join our discussion on Slack to share feedback, propose features, or collaborate.
License
Licensed under the MIT License. See LICENSE for details.