# MCP Terminal
A terminal-based interactive client for Model Context Protocol (MCP) servers.
## Installation

```
npm install -g mcp-terminal
```
## Features
- Connect to multiple MCP servers simultaneously
- Interactive terminal for sending messages to models
- Easy configuration management
- Support for both stdio and SSE transports
- Switch between connected servers
## Configuration

Before using mcp-terminal, you need to configure at least one server:

```
mcp-terminal configure
```

This opens your default editor with a configuration file where you can define MCP servers.

Example configuration:
```json
{
  "mcpServers": {
    "local-sse": {
      "command": "npx @anthropic-ai/mcp-server@latest",
      "args": [],
      "url": "http://localhost:8765/sse"
    },
    "local-stdio": {
      "command": "npx @anthropic-ai/mcp-server@latest",
      "args": ["--stdio"]
    },
    "shopify": {
      "command": "npx",
      "args": [
        "shopify-mcp",
        "--accessToken",
        "your-shopify-access-token",
        "--domain",
        "your-store.myshopify.com"
      ]
    }
  }
}
```
Notice that servers can be configured with:

- Both `command` and `url` for servers that need to be started locally but use SSE transport
- Just `command` for servers that use stdio transport
- Just `url` for connecting to remote servers
## Usage

### Configure MCP servers

```
mcp-terminal configure
```

This opens your default editor to configure MCP servers.

### Start an MCP server

```
mcp-terminal start
```

This starts a configured MCP server. You can have multiple servers configured.

### Interactive chat with AI using MCP tools

```
mcp-terminal chat
```

This starts an interactive chat session with an AI model that can use MCP tools from your configured server. The LLM can call the server's tools to help answer your questions and perform actions.

You can specify which server to use:

```
mcp-terminal chat -s local-stdio
```
### Server Types

The chat command supports two types of server configurations:

- **URL-based servers** - servers with a `url` configured will connect via HTTP/SSE
- **Command-based servers** - servers with only a `command` will be started automatically and use stdio transport
### Requirements

To use the chat feature, you need:

- An OpenAI API key (set as the `OPENAI_API_KEY` environment variable or in a `.env` file)
- A configured MCP server (set up with `mcp-terminal configure`)
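Either way of supplying the API key works; a minimal sketch (the key value is a placeholder you would replace with your own):

```shell
# Export the key for the current shell session (placeholder value)
export OPENAI_API_KEY="your-openai-api-key"

# Or persist it in a .env file in the directory where you run mcp-terminal
echo 'OPENAI_API_KEY=your-openai-api-key' > .env
```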
### Example usage

```
$ mcp-terminal chat
Starting chat session with LLM...
Type 'exit' or 'quit' to end the session.
Using MCP server: local-stdio
Starting MCP server: local-stdio
Server is running...
Connected to MCP server via stdio transport

You: What's the weather in New York today?
AI is thinking...
AI: I'd like to check the weather in New York for you, but I need to use a tool to get that information.

I attempted to use a weather tool, but we're currently connected via stdio transport, which doesn't allow me to directly access external tools. In a full implementation with the appropriate tools configured, I would be able to fetch real-time weather data for New York.

To get the actual weather in New York today, you could:
1. Use a different MCP server configured with HTTP/SSE transport and weather tools
2. Check a weather website or app directly
3. Ask me a different question I can answer without external tools

Can I help you with something else?

You: What is MCP?
AI is thinking...
AI: MCP stands for Model Context Protocol. It's an open standard protocol designed to connect AI language models (LLMs) like me with external tools, data sources, and APIs.

Here's what makes MCP important:
1. It allows AI models to extend their capabilities beyond their training data by accessing external tools and real-time information.
2. It provides a standardized way for developers to create tools that AI models can interact with, making integration simpler.
3. It enables AI assistants to perform actions in the real world - things like searching the web, accessing databases, running code, or interacting with services like the weather example you asked about earlier.
4. It can work through different transport methods, such as HTTP/SSE (Server-Sent Events) or stdio (standard input/output), depending on the implementation.

The mcp-terminal tool you're using right now is a client that helps manage MCP servers and facilitates communication between users, AI models, and the tools provided by those servers.

You: exit
```
## License
MIT