# MCP Everything
Note: This project was extracted from https://github.com/modelcontextprotocol/servers/tree/main/src/everything to create a standalone implementation.
This MCP server attempts to exercise all the features of the MCP protocol. It is not intended to be a useful server, but rather a test server for builders of MCP clients. It implements prompts, tools, resources, sampling, and more to showcase MCP capabilities.
## Installation

### Local Installation

```bash
# Clone the repository
git clone https://github.com/modelcontextprotocol/mcp-everything.git
cd mcp-everything

# Install dependencies
npm install

# Build the project
npm run build

# Start the server
npm start
```
### Global Installation

```bash
# Install globally from npm
npm install -g mcp-everything

# Run the server
mcp-everything
```
### Docker

```bash
# Build the Docker image
docker build -t mcp-everything .

# Run the container
docker run -it mcp-everything
```
## Usage with Claude Desktop

Add this to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "everything": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-everything"
      ]
    }
  }
}
```
## Components

### Tools

The server exposes the following tools (a sample `tools/call` request is sketched after this list):

- **echo**
  - Simple tool to echo back input messages
  - Input:
    - `message` (string): Message to echo back
  - Returns: Text content with the echoed message
- **add**
  - Adds two numbers together
  - Inputs:
    - `a` (number): First number
    - `b` (number): Second number
  - Returns: Text result of the addition
- **longRunningOperation**
  - Demonstrates progress notifications for long-running operations
  - Inputs:
    - `duration` (number, default: 10): Duration in seconds
    - `steps` (number, default: 5): Number of progress steps
  - Returns: Completion message with duration and steps
  - Sends progress notifications during execution
- **sampleLLM**
  - Demonstrates LLM sampling via the MCP sampling feature
  - Inputs:
    - `prompt` (string): The prompt to send to the LLM
    - `maxTokens` (number, default: 100): Maximum tokens to generate
  - Returns: Generated LLM response
- **getTinyImage**
  - Returns a small test image
  - No inputs required
  - Returns: Base64-encoded PNG image data
- **printEnv**
  - Prints all environment variables
  - Useful for debugging MCP server configuration
  - No inputs required
  - Returns: JSON string of all environment variables
- **annotatedMessage**
  - Demonstrates how annotations can be used to provide metadata about content
  - Inputs:
    - `messageType` (enum: "error" | "success" | "debug"): Type of message, to demonstrate different annotation patterns
    - `includeImage` (boolean, default: false): Whether to include an example image
  - Returns: Content with varying annotations
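As a rough sketch of the wire format (based on the MCP specification rather than output captured from this server), a `tools/call` request for the `add` tool might look like this; the `id` and argument values are arbitrary placeholders:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "add",
    "arguments": { "a": 2, "b": 3 }
  }
}
```

The server is expected to reply with a `result.content` array containing a single text item describing the sum.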
### Resources

The server provides 100 test resources in two formats (a sample `resources/read` request is sketched after the feature list below):

- **Even numbered resources**:
  - Plaintext format
  - URI pattern: `test://static/resource/{even_number}`
  - Content: Simple text description
- **Odd numbered resources**:
  - Binary blob format
  - URI pattern: `test://static/resource/{odd_number}`
  - Content: Base64-encoded binary data

Resource features:

- Supports pagination (10 items per page)
- Allows subscribing to resource updates
- Demonstrates resource templates
- Auto-updates subscribed resources every 5 seconds
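Following the URI pattern above, a `resources/read` request for one of the plaintext resources might look like the sketch below (the wire shape again follows the MCP specification; the `id` is a placeholder):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "resources/read",
  "params": {
    "uri": "test://static/resource/2"
  }
}
```

Per the list above, even-numbered resources should come back as text contents, while odd-numbered ones return base64-encoded blob data.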
### Prompts

The server provides two demonstration prompts (a sample `prompts/get` request is sketched after this list):

- **simple_prompt**
  - Basic prompt without arguments
  - Returns: Single message exchange
- **complex_prompt**
  - Advanced prompt demonstrating argument handling
  - Required arguments:
    - `temperature` (number): Temperature setting
  - Optional arguments:
    - `style` (string): Output style preference
  - Returns: Multi-turn conversation with images
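A `prompts/get` request for `complex_prompt` might look like the sketch below. The argument values are placeholders, and the number is quoted because the MCP specification types prompt arguments as strings on the wire:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "prompts/get",
  "params": {
    "name": "complex_prompt",
    "arguments": {
      "temperature": "0.7",
      "style": "concise"
    }
  }
}
```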
### Logging

The server sends log messages at a random level every 15 seconds to demonstrate MCP's logging capability, as sketched below.
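Each of these messages is delivered as an MCP logging notification. The sketch below shows the general shape of such a notification per the MCP specification; the level and message text are placeholders, since the server picks the level at random:

```json
{
  "jsonrpc": "2.0",
  "method": "notifications/message",
  "params": {
    "level": "info",
    "data": "Info-level log message"
  }
}
```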