Python client library for https://mcp.run - call portable & secure tools for your AI Agents and Apps


mcpx-py


A Python library for interacting with LLMs using mcp.run tools

Features

AI Provider Support

mcpx-py works with any model supported by PydanticAI.

Dependencies

  • uv
  • npm
  • ollama (optional)

mcp.run Setup

You will need to get an mcp.run session ID by running:

npx --yes -p @dylibso/mcpx gen-session --write

This will generate a new session and write the session ID to a configuration file that can be used by mcpx-py.

If you need to store the session ID in an environment variable instead, you can run gen-session without the --write flag:

npx --yes -p @dylibso/mcpx gen-session

which should output something like:

Login successful!
Session: kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ

Then set the MCP_RUN_SESSION_ID environment variable:

$ export MCP_RUN_SESSION_ID=kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
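
If you prefer to set the variable from Python rather than the shell (for example in a notebook), here is a minimal sketch using only the standard library; the value shown is a placeholder, not a real session ID:

import os

# Assumption: mcpx-py picks up the session from the MCP_RUN_SESSION_ID
# environment variable, so set it before constructing a Chat instance.
os.environ["MCP_RUN_SESSION_ID"] = "your-session-id-here"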

Python Usage

Installation

Using uv:

uv add mcpx-py

Or pip:

pip install mcpx-py

Example code

from mcpx_py import Chat

llm = Chat("claude-3-5-sonnet-latest")

# Or OpenAI
# llm = Chat("gpt-4o")

# Or Ollama
# llm = Chat("ollama:qwen2.5")

# Or Gemini
# llm = Chat("gemini-2.0-flash")

response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)

It's also possible to get structured output by setting result_type:

from mcpx_py import Chat, BaseModel, Field
from typing import List

class Summary(BaseModel):
    """
    A summary of some longer text
    """
    source: str = Field(description="The source of the original_text")
    original_text: str = Field(description="The original text to be summarized")
    items: List[str] = Field(description="A list of summary points")

llm = Chat("claude-3-5-sonnet-latest", result_type=Summary)
response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)
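
Assuming the typed result comes back as an instance of the Pydantic model (as it does with PydanticAI), the fields declared on Summary can be read directly; a minimal sketch:

summary = response.data  # expected to be a Summary instance

print(summary.source)
for point in summary.items:
    print(f"- {point}")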

More examples can be found in the examples/ directory

Command Line Usage

Installation

uv tool install mcpx-py

From git:

uv tool install git+https://github.com/dylibso/mcpx-py

Or from the root of the repo:

uv tool install .

uvx

mcpx-client can also be run without installing it, using uvx:

uvx --from mcpx-py mcpx-client

Or from git:

uvx --from git+https://github.com/dylibso/mcpx-py mcpx-client

Running

Get usage/help

mcpx-client --help

Chat with an LLM

mcpx-client chat

List tools

mcpx-client list

Call a tool

mcpx-client tool eval-js '{"code": "2+2"}'

LLM Configuration

Provider Setup

Claude
  1. Sign up for an Anthropic API account at https://console.anthropic.com
  2. Get your API key from the console
  3. Set the environment variable: ANTHROPIC_API_KEY=your_key_here
OpenAI
  1. Create an OpenAI account at https://platform.openai.com
  2. Generate an API key in your account settings
  3. Set the environment variable: OPENAI_API_KEY=your_key_here
Gemini
  1. Create a Gemini account at https://aistudio.google.com
  2. Generate an API key in your account settings
  3. Set the environment variable: GEMINI_API_KEY=your_key_here
Ollama
  1. Install Ollama from https://ollama.ai
  2. Pull your desired model: ollama pull llama3.2
  3. No API key needed - runs locally
Llamafile
  1. Download a Llamafile model from https://github.com/Mozilla-Ocho/llamafile/releases
  2. Make the file executable: chmod +x your-model.llamafile
  3. Run in JSON API mode: ./your-model.llamafile --json-api --host 127.0.0.1 --port 8080
  4. Use with the OpenAI provider pointing to http://localhost:8080 (see the sketch below)
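
For that last step, one way to point the OpenAI provider at the local Llamafile server is through the environment variables recognized by the official OpenAI client. This is a sketch under the assumptions that mcpx-py's OpenAI support goes through that client (which honors OPENAI_BASE_URL and OPENAI_API_KEY) and that the model name is simply passed through to the local server:

import os

# Assumptions: Llamafile is serving an OpenAI-compatible API at localhost:8080,
# and mcpx-py's OpenAI support uses the official openai client, which reads
# OPENAI_BASE_URL / OPENAI_API_KEY from the environment.
os.environ["OPENAI_BASE_URL"] = "http://localhost:8080/v1"
os.environ["OPENAI_API_KEY"] = "llamafile"  # placeholder; the local server ignores it

from mcpx_py import Chat

llm = Chat("gpt-4o")  # model name is passed through; Llamafile serves whatever model it loaded
response = llm.send_message_sync("Say hello from a local model")
print(response.data)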
