
MCP Web UI
MCP Web UI is a web-based user interface that serves as a Host within the Model Context Protocol (MCP) architecture. It provides a powerful and user-friendly interface for interacting with Large Language Models (LLMs) while managing context aggregation and coordination between clients and servers.
🌟 Overview
MCP Web UI is designed to simplify and enhance interactions with AI language models by providing:
- A unified interface for multiple LLM providers
- Real-time, streaming chat experiences
- Flexible configuration and model management
- Robust context handling using the MCP protocol
Demo Video
🚀 Features
- 🤖 Multi-Provider LLM Integration:
- Anthropic (Claude models)
- OpenAI (GPT models)
- Ollama (local models)
- OpenRouter (multiple providers)
- 💬 Intuitive Chat Interface
- 🔄 Real-time Response Streaming via Server-Sent Events (SSE)
- 🔧 Dynamic Configuration Management
- 📊 Advanced Context Aggregation
- 💾 Persistent Chat History using BoltDB
- 🎯 Flexible Model Selection
📋 Prerequisites
- Go 1.23+
- Docker (optional)
- API keys for desired LLM providers
🛠 Installation
Quick Start
- Clone the repository:
  git clone https://github.com/MegaGrindStone/mcp-web-ui.git
  cd mcp-web-ui
- Configure your environment:
  mkdir -p $HOME/.config/mcpwebui
  cp config.example.yaml $HOME/.config/mcpwebui/config.yaml
- Set up API keys:
  export ANTHROPIC_API_KEY=your_anthropic_key
  export OPENAI_API_KEY=your_openai_key
  export OPENROUTER_API_KEY=your_openrouter_key
Running the Application
Local Development
go mod download
go run ./cmd/server/main.go
Docker Deployment
docker build -t mcp-web-ui .
docker run -p 8080:8080 \
-v $HOME/.config/mcpwebui/config.yaml:/app/config.yaml \
-e ANTHROPIC_API_KEY \
-e OPENAI_API_KEY \
-e OPENROUTER_API_KEY \
mcp-web-ui
🔧 Configuration
The configuration file (config.yaml) provides comprehensive settings for customizing the MCP Web UI. Here's a detailed breakdown:
Server Configuration
- port: The port on which the server will run (default: 8080)
- logLevel: Logging verbosity (options: debug, info, warn, error; default: info)
- logMode: Log output format (options: json, text; default: text)
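For instance, these server settings sit at the top level of config.yaml. The values below are placeholders, and placing logMode alongside port and logLevel is an assumption based on the example snippet later in this document:

port: 8080
logLevel: debug
logMode: json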
Prompt Configuration
- systemPrompt: Default system prompt for the AI assistant
- titleGeneratorPrompt: Prompt used to generate chat titles
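Both prompts are plain strings. As an illustration (the prompt text is a placeholder; systemPrompt appears at the top level in the example snippet later in this document, and titleGeneratorPrompt is assumed to sit alongside it):

systemPrompt: You are a helpful assistant.
titleGeneratorPrompt: Generate a short, descriptive title for this conversation.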
LLM (Language Model) Configuration
The llm section supports multiple providers with provider-specific configurations:
Common LLM Parameters
- provider: Choose from: ollama, anthropic, openai, openrouter
- model: Specific model name (e.g., 'claude-3-5-sonnet-20241022')
- parameters: Fine-tune model behavior:
  - temperature: Randomness of responses (0.0-1.0)
  - topP: Nucleus sampling threshold
  - topK: Number of highest-probability tokens to keep
  - frequencyPenalty: Reduce repetition of token sequences
  - presencePenalty: Encourage discussing new topics
  - maxTokens: Maximum response length
  - stop: Sequences to stop generation
  - And more provider-specific parameters
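To make the nesting concrete, here is a hedged sketch of an llm block exercising several of these parameters. The values are arbitrary placeholders, and grouping every tuning knob under parameters is an assumption drawn from the list above; the example snippet later in this document only shows temperature there:

llm:
  provider: anthropic
  model: claude-3-5-sonnet-20241022
  parameters:
    temperature: 0.7
    topP: 0.9
    topK: 40
    frequencyPenalty: 0.5
    presencePenalty: 0.5
    stop:
      - "END" # placeholder stop sequence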
Provider-Specific Configurations
- Ollama:
  - host: Ollama server URL (default: http://localhost:11434)
- Anthropic:
  - apiKey: Anthropic API key (can use ANTHROPIC_API_KEY env variable)
  - maxTokens: Maximum token limit
  - Note: Stop sequences containing only whitespace are ignored, and whitespace is trimmed from valid sequences, as Anthropic doesn't support whitespace in stop sequences
- OpenAI:
  - apiKey: OpenAI API key (can use OPENAI_API_KEY env variable)
  - endpoint: OpenAI API endpoint (default: https://api.openai.com/v1)
  - For using alternative OpenAI-compatible APIs, see this discussion thread
- OpenRouter:
  - apiKey: OpenRouter API key (can use OPENROUTER_API_KEY env variable)
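For example, a local Ollama setup might look like the sketch below. The host value is the documented default, the model name is a placeholder for whatever model your Ollama install serves, and placing host directly under llm (alongside provider and model) is an assumption modeled on how maxTokens appears under llm for Anthropic in the example snippet later in this document:

llm:
  provider: ollama
  model: llama3.2 # placeholder: any model available to your Ollama install
  host: http://localhost:11434 # documented default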
Title Generator Configuration
The genTitleLLM section allows separate configuration for title generation, defaulting to the main LLM if not specified.
MCP Server Configurations
- mcpSSEServers: Configure Server-Sent Events (SSE) servers
  - url: SSE server URL
  - maxPayloadSize: Maximum payload size
- mcpStdIOServers: Configure Standard Input/Output servers
  - command: Command to run the server
  - args: Arguments for the server command
Example MCP Server Configurations
SSE Server Example:
mcpSSEServers:
  filesystem:
    url: https://yoursseserver.com
    maxPayloadSize: 1048576 # 1MB
StdIO Server Examples:
- Using the official filesystem MCP server:
mcpStdIOServers:
  filesystem:
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-filesystem"
      - "/path/to/your/files"
This example can be used directly, as the official filesystem MCP server is an executable package that can be run with npx; just update the path to point to your desired directory.
- Using the go-mcp filesystem MCP server:
mcpStdIOServers:
  filesystem:
    command: go
    args:
      - run
      - github.com/your_username/your_app # Replace with your app
      - -path
      - "/data/mcp/filesystem" # Path to expose to MCP clients
For this example, you'll need to create a new Go application that imports the github.com/MegaGrindStone/go-mcp/servers/filesystem package. The flag naming (like -path in this example) is completely customizable based on how you structure your own application; it doesn't have to be called "path". This example is merely a starting point showing one possible implementation where a flag is used to specify which directory to expose. You're free to design your own application structure and command-line interface according to your specific needs.
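To make that concrete, here is a minimal, hypothetical sketch of such a wrapper using only the standard library. The -path flag mirrors the configuration above, but the actual go-mcp filesystem constructor and stdio serving calls are deliberately left as a comment; consult that package's documentation for the real API:

package main

import (
    "flag"
    "log"
    "os"
)

func main() {
    // The flag name is arbitrary; it only has to match the args listed
    // under mcpStdIOServers in config.yaml.
    path := flag.String("path", ".", "directory to expose to MCP clients")
    flag.Parse()

    // Fail fast if the path doesn't exist or isn't a directory.
    info, err := os.Stat(*path)
    if err != nil || !info.IsDir() {
        log.Fatalf("not a usable directory: %q", *path)
    }

    // Here you would construct the filesystem server from
    // github.com/MegaGrindStone/go-mcp/servers/filesystem and serve it over
    // stdin/stdout so MCP Web UI can talk to it as a StdIO MCP server.
    log.Printf("would expose %s to MCP clients over stdio", *path)
}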
Example Configuration Snippet
port: 8080
logLevel: info
systemPrompt: You are a helpful assistant.
llm:
  provider: anthropic
  model: claude-3-5-sonnet-20241022
  maxTokens: 1000
  parameters:
    temperature: 0.7
genTitleLLM:
  provider: openai
  model: gpt-3.5-turbo
🏗 Project Structure
- cmd/: Application entry point
- internal/handlers/: Web request handlers
- internal/models/: Data models
- internal/services/: LLM provider integrations
- static/: Static assets (CSS)
- templates/: HTML templates
🤝 Contributing
- Fork the repository
- Create a feature branch
- Commit your changes
- Push and create a Pull Request
📄 License
MIT License