
MLflow MCP Server: Natural Language Interface for MLflow

This project provides a natural language interface to MLflow via the Model Context Protocol (MCP). It allows you to query your MLflow tracking server using plain English, making it easier to manage and explore your machine learning experiments and models.

Overview

MLflow MCP Agent consists of two main components:

  1. MLflow MCP Server (mlflow_server.py): Connects to your MLflow tracking server and exposes MLflow functionality through the Model Context Protocol (MCP).

  2. MLflow MCP Client (mlflow_client.py): Provides a natural language interface to interact with the MLflow MCP Server using a conversational AI assistant.
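Conceptually, the server maps tool names to MLflow-backed functions, and the client (driven by the LLM) picks a tool and arguments and receives a structured result. The sketch below is illustrative only: it uses canned data instead of real MLflow calls, and the dispatch table is a stand-in for what the MCP protocol handles for you.

```python
# Illustrative stand-in for the server's tool dispatch. In the real
# mlflow_server.py these functions would call the MLflow client; here
# they return canned data so the request/response flow is easy to follow.

def list_models():
    """Stand-in for the MCP tool that queries the MLflow model registry."""
    return ["iris-classifier", "churn-predictor"]

def get_model_details(name):
    """Stand-in for the MCP tool that fetches one registered model."""
    return {"name": name, "latest_version": 3, "stage": "Production"}

# The server exposes tools by name; the client selects one and supplies
# arguments, then receives the structured result back over MCP.
TOOLS = {
    "list_models": list_models,
    "get_model_details": get_model_details,
}

def call_tool(tool_name, **kwargs):
    """Mimics a single client -> server tool-call round trip."""
    return TOOLS[tool_name](**kwargs)

print(call_tool("list_models"))
print(call_tool("get_model_details", name="iris-classifier"))
```

In the actual server, this bookkeeping is handled by the MCP Python SDK; the point here is only that each tool is an ordinary function returning JSON-serializable data.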

Features

  • Natural Language Queries: Ask questions about your MLflow tracking server in plain English
  • Model Registry Exploration: Get information about your registered models
  • Experiment Tracking: List and explore your experiments and runs
  • System Information: Get status and metadata about your MLflow environment

Prerequisites

  • Python 3.8+
  • MLflow server running (default: http://localhost:8080)
  • OpenAI API key for the LLM

Installation

  1. Clone this repository:

    git clone https://github.com/iRahulPandey/mlflowMCPServer.git
    cd mlflowMCPServer
    
  2. Create a virtual environment:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
    
  3. Install the required packages:

    pip install "mcp[cli]" langchain-mcp-adapters langchain-openai langgraph mlflow
    
  4. Set your OpenAI API key:

    export OPENAI_API_KEY=your_key_here
    
  5. (Optional) Configure the MLflow tracking server URI:

    export MLFLOW_TRACKING_URI=http://localhost:8080
    

Usage

Starting the MCP Server

First, start the MLflow MCP server:

python mlflow_server.py

The server connects to your MLflow tracking server and exposes MLflow functionality via MCP.

Making Queries

Once the server is running, you can make natural language queries using the client:

python mlflow_client.py "What models do I have registered in MLflow?"

Example Queries:

  • "Show me all registered models in MLflow"
  • "List all my experiments"
  • "Get details for the model named 'iris-classifier'"
  • "What's the status of my MLflow server?"

Configuration

You can customize the behavior using environment variables:

  • MLFLOW_TRACKING_URI: URI of your MLflow tracking server (default: http://localhost:8080)
  • OPENAI_API_KEY: Your OpenAI API key
  • MODEL_NAME: The OpenAI model to use (default: gpt-3.5-turbo-0125)
  • MLFLOW_SERVER_SCRIPT: Path to the MLflow MCP server script (default: mlflow_server.py)
  • LOG_LEVEL: Logging level (default: INFO)
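Configuration like this is typically read once at startup with environment-variable fallbacks. A minimal sketch, using the variable names and defaults documented above (the exact code in mlflow_server.py and mlflow_client.py may differ):

```python
import os

# Defaults mirror the documented ones; each can be overridden by
# exporting the corresponding environment variable before startup.
MLFLOW_TRACKING_URI = os.environ.get("MLFLOW_TRACKING_URI", "http://localhost:8080")
MODEL_NAME = os.environ.get("MODEL_NAME", "gpt-3.5-turbo-0125")
MLFLOW_SERVER_SCRIPT = os.environ.get("MLFLOW_SERVER_SCRIPT", "mlflow_server.py")
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")

print(MLFLOW_TRACKING_URI, MODEL_NAME, MLFLOW_SERVER_SCRIPT, LOG_LEVEL)
```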

MLflow MCP Server (mlflow_server.py)

The server connects to your MLflow tracking server and exposes the following tools via MCP:

  • list_models: Lists all registered models in the MLflow model registry
  • list_experiments: Lists all experiments in the MLflow tracking server
  • get_model_details: Gets detailed information about a specific registered model
  • get_system_info: Gets information about the MLflow tracking server and system
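Tool results are returned to the client as JSON-serializable data that the LLM can reason over. As an illustration of the kind of payload a tool like get_system_info might produce, here is a hypothetical sketch built from the standard library only (the real tool would also query the MLflow tracking server, and its exact fields may differ):

```python
import json
import platform

def get_system_info():
    """Hypothetical shape of a get_system_info payload; the tracking_uri
    value here is a placeholder, not a live server lookup."""
    return {
        "python_version": platform.python_version(),
        "platform": platform.system(),
        "tracking_uri": "http://localhost:8080",  # placeholder default
    }

# MCP tool results travel to the client as JSON text.
print(json.dumps(get_system_info(), indent=2))
```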

Limitations

  • Currently only supports a subset of MLflow functionality
  • The client requires internet access to use OpenAI models
  • Error handling may be limited for complex MLflow operations

Future Improvements

  • Add support for MLflow model predictions
  • Improve the natural language understanding for more complex queries
  • Add visualization capabilities for metrics and parameters
  • Support for more MLflow operations like run management and artifact handling
