mcpx-py
A Python library for interacting with LLMs using mcp.run tools
Features
- AI Provider Support: mcpx-py supports all models supported by PydanticAI
Dependencies
- uv
- npm
- ollama (optional)
mcp.run Setup
You will need to get an mcp.run session ID by running:
npx --yes -p @dylibso/mcpx gen-session --write
This will generate a new session and write the session ID to a configuration file that can be used by mcpx-py.
If you need to store the session ID in an environment variable, you can run gen-session without the --write flag:
npx --yes -p @dylibso/mcpx gen-session
which should output something like:
Login successful!
Session: kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
Then set the MCP_RUN_SESSION_ID environment variable:
$ export MCP_RUN_SESSION_ID=kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
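The session ID can also be set from Python before the client is used. A minimal sketch, assuming mcpx-py picks up MCP_RUN_SESSION_ID from the process environment (the session value below is a placeholder):

import os

# Assumption: mcpx-py reads the mcp.run session from MCP_RUN_SESSION_ID
# if it is present in the environment when the client is created.
os.environ["MCP_RUN_SESSION_ID"] = "your-session-id"

from mcpx_py import Chat

llm = Chat("claude-3-5-sonnet-latest")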
Python Usage
Installation
Using uv:
uv add mcpx-py
Or pip:
pip install mcpx-py
Example code
from mcpx_py import Chat

llm = Chat("claude-3-5-sonnet-latest")
# Or OpenAI
# llm = Chat("gpt-4o")
# Or Ollama
# llm = Chat("ollama:qwen2.5")
# Or Gemini
# llm = Chat("gemini-2.0-flash")

response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)
It's also possible to get structured output by setting result_type:
from mcpx_py import Chat, BaseModel, Field
from typing import List

class Summary(BaseModel):
    """
    A summary of some longer text
    """
    source: str = Field(description="The source of the original_text")
    original_text: str = Field(description="The original text to be summarized")
    items: List[str] = Field(description="A list of summary points")

llm = Chat("claude-3-5-sonnet-latest", result_type=Summary)
response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)
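Because result_type is a Pydantic model, the parsed result can be used like any other model instance. A minimal sketch, assuming response.data is the populated Summary from the example above and Pydantic v2 is in use:

summary = response.data  # assumed to be a Summary instance, per the example above

print(summary.source)
for item in summary.items:
    print("-", item)

# Pydantic v2 models can be serialized back to JSON
print(summary.model_dump_json(indent=2))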
More examples can be found in the examples/ directory
Command Line Usage
Installation
uv tool install mcpx-py
From git:
uv tool install git+https://github.com/dylibso/mcpx-py
Or from the root of the repo:
uv tool install .
uvx
mcpx-client can also be executed without being installed using uvx:
uvx --from mcpx-py mcpx-client
Or from git:
uvx --from git+https://github.com/dylibso/mcpx-py mcpx-client
Running
Get usage/help
mcpx-client --help
Chat with an LLM
mcpx-client chat
List tools
mcpx-client list
Call a tool
mcpx-client tool eval-js '{"code": "2+2"}'
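The CLI can also be driven from a script. A minimal sketch that shells out to the same tool subcommand shown above (the JSON arguments and plain-text output handling are assumptions):

import json
import subprocess

# Call the eval-js tool via the mcpx-client CLI, as in the shell example above
args = json.dumps({"code": "2+2"})
result = subprocess.run(
    ["mcpx-client", "tool", "eval-js", args],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)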
LLM Configuration
Provider Setup
Claude
- Sign up for an Anthropic API account at https://console.anthropic.com
- Get your API key from the console
- Set the environment variable:
ANTHROPIC_API_KEY=your_key_here
OpenAI
- Create an OpenAI account at https://platform.openai.com
- Generate an API key in your account settings
- Set the environment variable:
OPENAI_API_KEY=your_key_here
Gemini
- Create a Gemini account at https://aistudio.google.com
- Generate an API key in your account settings
- Set the environment variable:
GEMINI_API_KEY=your_key_here
Ollama
- Install Ollama from https://ollama.ai
- Pull your desired model:
ollama pull llama3.2
- No API key needed - runs locally
Llamafile
- Download a Llamafile model from https://github.com/Mozilla-Ocho/llamafile/releases
- Make the file executable:
chmod +x your-model.llamafile
- Run in JSON API mode:
./your-model.llamafile --json-api --host 127.0.0.1 --port 8080
- Use with the OpenAI provider pointing to http://localhost:8080 (see the sketch below)
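The README does not show a dedicated base-URL option, but the OpenAI Python client used by PydanticAI honors the standard OPENAI_BASE_URL and OPENAI_API_KEY environment variables, so one hedged way to point the OpenAI provider at a local Llamafile is:

import os

# Assumption: the OpenAI provider behind mcpx-py/PydanticAI honors the
# standard OPENAI_BASE_URL / OPENAI_API_KEY environment variables.
os.environ["OPENAI_BASE_URL"] = "http://localhost:8080/v1"  # Llamafile's OpenAI-compatible endpoint; path may vary
os.environ["OPENAI_API_KEY"] = "sk-local-placeholder"       # local servers typically ignore the key

from mcpx_py import Chat

llm = Chat("gpt-4o")  # the model name is passed through to the local server
response = llm.send_message_sync("summarize the contents of example.com")
print(response.data)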