LMStudio-MCP
A Model Control Protocol (MCP) server that allows Claude to communicate with locally running LLM models via LM Studio.
Overview
LMStudio-MCP creates a bridge between Claude (with MCP capabilities) and your locally running LM Studio instance. This allows Claude to:
- Check the health of your LM Studio API
- List available models
- Get the currently loaded model
- Generate completions using your local models
This enables you to leverage your own locally running models through Claude's interface, combining Claude's capabilities with your private models.
Prerequisites
- Python 3.7+
- LM Studio installed and running locally with a model loaded
- Claude with MCP access
- Required Python packages (see Installation)
Installation
- Clone this repository:
  git clone https://github.com/infinitimeless/LMStudio-MCP.git
  cd LMStudio-MCP
- Install the required packages:
  pip install requests "mcp[cli]" openai
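The local-installation MCP configuration below activates a virtual environment before starting the bridge. A minimal sketch of setting one up (the venv/ directory name simply matches that configuration example; any virtual environment works):

python -m venv venv
source venv/bin/activate
pip install requests "mcp[cli]" openai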
MCP Configuration
For Claude to connect to this bridge, you need to configure the MCP settings properly. You can either:
- Use directly from GitHub:
  {
    "lmstudio-mcp": {
      "command": "uvx",
      "args": [
        "https://github.com/infinitimeless/LMStudio-MCP"
      ]
    }
  }
- Use local installation:
  {
    "lmstudio-mcp": {
      "command": "/bin/bash",
      "args": [
        "-c",
        "cd /path/to/LMStudio-MCP && source venv/bin/activate && python lmstudio_bridge.py"
      ]
    }
  }
For detailed MCP configuration instructions, see MCP_CONFIGURATION.md.
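As a concrete sketch, assuming Claude Desktop's standard claude_desktop_config.json layout (the mcpServers wrapper and the config file location are Claude Desktop conventions, not something defined by this repository), the GitHub variant would be registered like this:

{
  "mcpServers": {
    "lmstudio-mcp": {
      "command": "uvx",
      "args": [
        "https://github.com/infinitimeless/LMStudio-MCP"
      ]
    }
  }
}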
Usage
- Start your LM Studio application and ensure it's running on port 1234 (the default)
- Load a model in LM Studio
- If running locally (not using uvx), run the LMStudio-MCP server:
  python lmstudio_bridge.py
- In Claude, connect to the MCP server when prompted by selecting "lmstudio-mcp"
Available Functions
The bridge provides the following functions:
- health_check(): Verify if LM Studio API is accessible
- list_models(): Get a list of all available models in LM Studio
- get_current_model(): Identify which model is currently loaded
- chat_completion(prompt, system_prompt, temperature, max_tokens): Generate text from your local model
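Under the hood, these functions call LM Studio's OpenAI-compatible HTTP API on port 1234. As an illustrative sketch (not the bridge's actual implementation), a chat_completion-style call roughly amounts to:

import requests

LMSTUDIO_API = "http://127.0.0.1:1234/v1"

def chat_completion(prompt, system_prompt="", temperature=0.7, max_tokens=1024):
    # Pick the first model LM Studio reports via its OpenAI-compatible /v1/models endpoint
    model = requests.get(f"{LMSTUDIO_API}/models", timeout=10).json()["data"][0]["id"]
    # Send an OpenAI-style chat completions request to the local server
    response = requests.post(
        f"{LMSTUDIO_API}/chat/completions",
        json={
            "model": model,
            "messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": prompt},
            ],
            "temperature": temperature,
            "max_tokens": max_tokens,
        },
        timeout=120,
    )
    response.raise_for_status()
    # Return the text of the first choice, following the OpenAI response format
    return response.json()["choices"][0]["message"]["content"]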
Known Limitations
- Some models (e.g., phi-3.5-mini-instruct_uncensored) may have compatibility issues
- The bridge currently uses only the OpenAI-compatible API endpoints of LM Studio
- Model responses will be limited by the capabilities of your locally loaded model
Troubleshooting
API Connection Issues
If Claude reports 404 errors when trying to connect to LM Studio:
- Ensure LM Studio is running and has a model loaded
- Check that LM Studio's server is running on port 1234
- Verify your firewall isn't blocking the connection
- Try using "127.0.0.1" instead of "localhost" in the API URL if issues persist
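As a quick sanity check (a minimal sketch assuming the default port and LM Studio's OpenAI-compatible /v1/models endpoint), you can query the API directly and confirm you get an HTTP 200 with a model list:

import requests

# A healthy LM Studio server answers with HTTP 200 and a JSON list of models
resp = requests.get("http://127.0.0.1:1234/v1/models", timeout=5)
print(resp.status_code)
print(resp.json())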
Model Compatibility
If certain models don't work correctly:
- Some models might not fully support the OpenAI chat completions API format
- Try different parameter values (temperature, max_tokens) for problematic models
- Consider switching to a more compatible model if problems persist
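For example, because the bridge talks to LM Studio's OpenAI-compatible API, you can experiment with more conservative parameters outside of Claude using the openai package installed earlier. This is a sketch under the usual local LM Studio conventions (the base_url and placeholder api_key are assumptions, not values defined by this project):

from openai import OpenAI

# Point the OpenAI client at the local LM Studio server; LM Studio ignores the
# API key, but the client requires a non-empty string.
client = OpenAI(base_url="http://127.0.0.1:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model=client.models.list().data[0].id,  # first model reported by LM Studio
    messages=[{"role": "user", "content": "Reply with one short sentence."}],
    temperature=0.2,   # lower temperature for more deterministic output
    max_tokens=128,    # keep responses short while testing compatibility
)
print(response.choices[0].message.content)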
For more detailed troubleshooting help, see TROUBLESHOOTING.md.
License
MIT
Acknowledgements
This project was originally developed as "Claude-LMStudio-Bridge_V2" and has been renamed and open-sourced as "LMStudio-MCP".