
# NBA Machine Learning Prediction (MCP) Platform with LangGraph Integration
A state-of-the-art FastAPI application for NBA data analysis and prediction leveraging LangGraph's AI agent workflows. This platform combines real-time NBA data with advanced machine learning techniques to deliver insights and predictions.
## 🌟 Features
- **NBA Data Integration**
  - Real-time NBA game data and statistics
  - Historical player and team performance tracking
  - Live scoreboard and play-by-play analysis
  - Comprehensive league leaders and career statistics
- **Production-Ready Architecture**
  - FastAPI for high-performance async API endpoints
  - LangGraph integration for AI agent workflows
  - Langfuse for LLM observability and monitoring
  - Structured logging with environment-specific formatting
  - Rate limiting with configurable rules
  - PostgreSQL for data persistence
  - Docker and Docker Compose support
  - Prometheus metrics and Grafana dashboards for monitoring
- **Advanced NBA Analytics**
  - Player career statistics analysis
  - Team performance metrics
  - League leaders tracking
  - Game log data exploration
  - Play-by-play breakdown and analysis
- **Security**
  - JWT-based authentication
  - Session management
  - Input sanitization
  - CORS configuration
  - Rate limiting protection
- **Developer Experience**
  - Environment-specific configuration
  - Comprehensive logging system
  - Clear project structure
  - Type hints throughout
  - Easy local development setup
- **Model Evaluation Framework**
  - Automated metric-based evaluation of model outputs
  - Integration with Langfuse for trace analysis
  - Detailed JSON reports with success/failure metrics
  - Interactive command-line interface
  - Customizable evaluation metrics
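To make the evaluation framework concrete, here is a minimal stdlib-only sketch of a metric-based evaluator that emits a JSON-ready report with success/failure counts. All names (`Metric`, `evaluate`) are illustrative, not the project's actual API:

```python
import json
from dataclasses import dataclass
from typing import Callable

@dataclass
class Metric:
    """A named pass/fail check applied to one model output."""
    name: str
    check: Callable[[str], bool]

def evaluate(outputs: list[str], metrics: list[Metric]) -> dict:
    """Run every metric over every output and summarize as a JSON-ready dict."""
    results = []
    for i, text in enumerate(outputs):
        scores = {m.name: m.check(text) for m in metrics}
        results.append({"output_index": i, "scores": scores})
    passed = sum(all(r["scores"].values()) for r in results)
    return {
        "total": len(outputs),
        "passed": passed,
        "failed": len(outputs) - passed,
        "results": results,
    }

if __name__ == "__main__":
    metrics = [
        Metric("non_empty", lambda t: bool(t.strip())),
        Metric("mentions_team", lambda t: any(x in t for x in ("LAL", "BOS"))),
    ]
    print(json.dumps(evaluate(["LAL won by 12.", ""], metrics), indent=2))
```

The real framework additionally pulls traces from Langfuse; this sketch only shows the report shape.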
## 🏷️ Ports & Endpoints
| Service | Port | Purpose / Endpoint |
|---|---|---|
| App (FastAPI) | 8000 | Main API & MCP SSE/WSS (default "claude" mode); Swagger UI: http://localhost:8000/docs |
| App (FastAPI) | 8001 | Alternative "local" SSE/WSS mode |
| PostgreSQL DB | 5432 | `db` service; connection string: `postgresql://postgres:mysecretpw@db:5432/nba_mcp_dev` |
| Prometheus | 9090 | Scrape target: http://app:8000/metrics (config: `prometheus/prometheus.yml`) |
| Grafana | 3000 | Dashboards: http://localhost:3000 (admin/admin) |
## 🚀 Quick Start
### 1. Prerequisites

- Python 3.13+
- PostgreSQL
- Docker & Docker Compose (optional)
2. Clone & Environment Setup
git clone https://github.com/ghadfield32/nba_mcp_langgraph.git
cd nba_mcp_langgraph
Create and activate your virtual environment:
uv sync # Creates and activates the .venv (cross-platform)
Note:
uv sync
will auto-activate the venv in most shells, including PowerShell and Bash.
If it doesn't, you can manually activate:
-
Windows (PowerShell)
.\.venv\Scripts\Activate.ps1
-
Windows (Command Prompt)
.\.venv\Scripts\activate.bat
-
macOS/Linux (Bash/Zsh)
source .venv/bin/activate
Copy the example environment file and update values for your own setup (do not commit your personal secrets):
cp .env.example .env.development # or .env.staging / .env.production
Open the newly created .env.development
file and update ONLY the placeholder values:
APP_ENV=development
NBA_MCP_PORT=8000
POSTGRES_URL=postgresql://postgres:mysecretpw@db:5432/nba_mcp_dev
LLM_API_KEY=<your-llm-key>
JWT_SECRET_KEY=<your-jwt-secret>
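A quick way to sanity-check that the placeholders were actually filled in is to parse the file and flag values still wrapped in angle brackets. This is a stdlib-only sketch; the helper names `parse_env` and `check_env` are hypothetical, not part of the project:

```python
REQUIRED_KEYS = {"APP_ENV", "NBA_MCP_PORT", "POSTGRES_URL", "LLM_API_KEY", "JWT_SECRET_KEY"}

def parse_env(text: str) -> dict[str, str]:
    """Parse KEY=VALUE lines, ignoring blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

def check_env(env: dict[str, str]) -> list[str]:
    """Return a list of problems: missing keys or unfilled <placeholder> values."""
    problems = [f"missing: {k}" for k in REQUIRED_KEYS - env.keys()]
    problems += [f"placeholder: {k}" for k, v in env.items() if v.startswith("<")]
    return sorted(problems)

if __name__ == "__main__":
    sample = "APP_ENV=development\nLLM_API_KEY=<your-llm-key>\n"
    for problem in check_env(parse_env(sample)):
        print("WARNING:", problem)
```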
### 3. Database Setup

Start the database:

```bash
docker compose up -d db
```

Verify it's healthy:

```bash
docker ps   # look for "nba-db-dev" on port 5432 with healthy status
```

If needed, manually apply `schema.sql` to the PostgreSQL database:

```bash
psql "postgresql://postgres:mysecretpw@localhost:5432/nba_mcp_dev" -f schema.sql
```
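For reference, the connection string decomposes with the stdlib, which is handy when translating between the in-container hostname (`db`) and the host-machine view (`localhost`). A small sketch (the helper name `dsn_parts` is hypothetical):

```python
from urllib.parse import urlsplit

def dsn_parts(dsn: str) -> dict:
    """Split a postgresql:// DSN into its components."""
    u = urlsplit(dsn)
    return {
        "user": u.username,
        "password": u.password,
        "host": u.hostname,       # "db" inside Docker Compose, "localhost" from the host
        "port": u.port,
        "database": u.path.lstrip("/"),
    }

print(dsn_parts("postgresql://postgres:mysecretpw@db:5432/nba_mcp_dev"))
```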
### 4. Running the Application

#### Development Mode (with auto-reload)

```bash
# option 1: Invoke task
inv dev

# option 2: Makefile
make dev
```

- Environment: `.env.development`
- Port: 8000
- Reload: enabled (`--reload`)
- Logging: DEBUG, human-readable console

Watch the console for:

```
INFO:     Will watch for changes in these directories...
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
Loading environment: Environment.DEVELOPMENT
Loaded environment from .env.development
...application startup complete.
```
#### Production Mode (no reload)

```bash
# locally without Docker
inv prod   # or make prod
```

- Environment: `.env.production`
- Port: 8000
- Reload: disabled
- Logging: WARNING+, JSON format

Or bring up the full stack via Docker Compose:

```bash
APP_ENV=production docker compose up -d --build
```

This will start:

- `app` (FastAPI + MCP) on port 8000
- `db` (PostgreSQL) on port 5432
- `prometheus` on port 9090
- `grafana` on port 3000
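The environment-specific log formatting described above (human-readable DEBUG output in development, JSON at WARNING and above in production) can be sketched with the stdlib alone. The `JsonFormatter` class and `make_logger` helper below are illustrative, not the project's actual logging setup:

```python
import json
import logging
import sys

class JsonFormatter(logging.Formatter):
    """Render each record as a single JSON line (production style)."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

def make_logger(env: str) -> logging.Logger:
    """DEBUG + human-readable in development; WARNING+ JSON elsewhere."""
    logger = logging.getLogger(f"nba_mcp.{env}")
    logger.handlers.clear()
    handler = logging.StreamHandler(sys.stdout)
    if env == "development":
        logger.setLevel(logging.DEBUG)
        handler.setFormatter(logging.Formatter("%(levelname)s %(name)s: %(message)s"))
    else:
        logger.setLevel(logging.WARNING)
        handler.setFormatter(JsonFormatter())
    logger.addHandler(handler)
    return logger

if __name__ == "__main__":
    make_logger("development").debug("dev logging is verbose")
    make_logger("production").warning("prod logging is JSON")
```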
### 5. Accessing Dashboards & Endpoints

- Swagger UI: http://localhost:8000/docs
- FastAPI root: http://localhost:8000/
- Prometheus: http://localhost:9090/targets (scrapes `/metrics`)
- Grafana: http://localhost:3000 (admin/admin)
- Raw metrics: http://localhost:8000/metrics
#### Grafana Dashboards

We pre-load four dashboards in Grafana:

- **API Performance** (`api_performance.json`): HTTP QPS, 95th-percentile latency, errors by endpoint.
- **Rate Limiting** (`rate_limiting.json`): per-endpoint and global rate-limit counters.
- **Database Performance** (`db_performance.json`): connection-pool usage, active connections, query timing.
- **System Resource Usage** (`system_usage.json`): host CPU, memory, disk I/O, network (requires node-exporter).
#### Automatic Provisioning

If you prefer automatic loading (recommended for production), copy the `grafana/provisioning/` directory:

```bash
cp -r ../../fastapi-langgraph-agent-production-ready-template/grafana/provisioning ./grafana/provisioning
cp -r ../../fastapi-langgraph-agent-production-ready-template/grafana/dashboards ./grafana/dashboards
```

Then update `docker-compose.yml` to mount them:

```yaml
grafana:
  # ...
  volumes:
    - ./grafana/provisioning:/etc/grafana/provisioning
    - ./grafana/dashboards:/var/lib/grafana/dashboards
```

Restart Grafana:

```bash
docker compose restart grafana
```

Your dashboards will now appear automatically under Dashboards → Home.

#### Manual Import

Alternatively, import the JSON files via **+ → Import** in the Grafana UI, selecting Prometheus as the data source.
### 💡 Single vs. Multiple Dashboards?

We recommend using Grafana as a single pane of glass: it can host all four dashboards (plus any future ones) in one UI. That keeps everything centralized:

- One login, one data-source configuration, and consistent alerts.
- A Grafana folder named "NBA MCP" can group these dashboards together.

With this setup, new team members can spin up the full monitoring stack immediately and have rich, production-grade dashboards at their fingertips.
## 🛠️ Infrastructure Components
### PostgreSQL

PostgreSQL is a powerful, open-source object-relational database management system with full ACID compliance, extensibility, and robust concurrency control, making it well suited to production environments. It offers advanced features such as JSON data types, full-text search, and multiple index methods to handle diverse data workloads efficiently. Its active community and rich ecosystem ensure continuous improvement, a wealth of extensions (e.g., PostGIS), and long-term reliability.

### Uvicorn

Uvicorn is a lightning-fast ASGI server implementation for Python, designed to run frameworks like FastAPI. It implements the ASGI specification using `uvloop` and `httptools`, supports HTTP/1.1 and WebSockets, and enables true asynchronous request handling for modern APIs.

### Prometheus

Prometheus is an open-source monitoring system and time-series database originally built at SoundCloud. It scrapes metrics from instrumented applications via a pull model, storing them efficiently with a multi-dimensional data model. Prometheus includes a powerful query language (PromQL) for real-time aggregation, alerting, and visualization. Its standalone architecture ensures high reliability, minimal external dependencies, and straightforward setup for microservices monitoring.

### Grafana

Grafana is an open-source analytics and visualization platform for building interactive, dynamic dashboards. It integrates with multiple data sources, including Prometheus and PostgreSQL, allowing you to correlate metrics across your stack. Grafana's rich ecosystem of plugins, templating, and built-in alerting enables tailored monitoring views and proactive operational insight.
## 🚀 Standalone Ollama + LangGraph Demo

```bash
# 1. Start Ollama (listens on port 11434 by default):
ollama serve

# 2. In one terminal, run your NBA MCP server:
inv dev   # or `python -m nba_mcp --transport sse`

# 3. In another terminal, run the demo:
python examples/langgraph_ollama_agent_w_tools.py --mode local
```
## 🛠️ Adding New MCP Tools
If you'd like to extend the NBA MCP server with your own tools:
1. **Define a Pydantic model** for any structured parameters (optional):

   ```python
   from pydantic import BaseModel, Field

   class MyToolParams(BaseModel):
       team: str = Field(..., description="Team abbreviation like 'LAL'")
       limit: int = Field(10, ge=1, description="Number of records")
   ```

2. **Add a new tool function** in `nba_server.py`:

   ```python
   @mcp_server.tool()
   async def get_top_players(params: MyToolParams) -> str:
       # Your logic here, e.g. call NBAApiClient
       data = await client.get_top_players(params.team, limit=params.limit)
       return json.dumps(data)
   ```

3. **Mount it to HTTP** (optional) by adding a FastAPI route:

   ```python
   @router.get("/top_players/{team}")
   async def top_players(team: str, limit: int = Query(10)):
       params = MyToolParams(team=team, limit=limit)
       return await get_top_players(params)
   ```

4. **Reload the server** (in dev) or rebuild your Docker image (in prod).

Your new tool will now be discoverable via MCP's `/messages/` SSE/WSS endpoint and as a standard HTTP endpoint under `/api/v1/mcp/nba/`.
## NBA MCP - Two-Port Architecture
This project implements a FastMCP server for NBA data with a two-port architecture that separates the main API server from the Server-Sent Events (SSE) server.
### Overview

The application uses two separate ports for better separation of concerns:

- **Main API server** (port 8000 by default): handles regular HTTP requests, API endpoints, and documentation.
- **SSE server** (port 8001 by default): dedicated to Server-Sent Events for real-time data streaming.

This approach provides several benefits:

- Clear separation between HTTP API and event-streaming concerns
- No path conflicts or complex mounting logic
- Simpler debugging and testing
- Better scalability (each server can be scaled independently)
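Because the dedicated port speaks standard `text/event-stream` framing, any SSE-aware client can consume it. Below is a minimal stdlib-only sketch of that framing for illustration; real clients would use an SSE or MCP client library rather than this hand-rolled parser:

```python
def parse_sse(stream: str) -> list[dict]:
    """Parse text/event-stream framing: `field: value` lines, blank line ends an event."""
    events, current = [], {}
    for line in stream.splitlines():
        if not line.strip():              # blank line dispatches the accumulated event
            if current:
                events.append(current)
                current = {}
        elif line.startswith(":"):        # comment / keep-alive line, ignored
            continue
        else:
            field, _, value = line.partition(":")
            value = value.lstrip(" ")
            if field == "data":           # multiple data lines join with newlines
                current["data"] = current["data"] + "\n" + value if "data" in current else value
            else:
                current[field] = value
    if current:                           # stream ended without a trailing blank line
        events.append(current)
    return events

sample = 'event: message\ndata: {"score": 101}\n\n: keep-alive\n\ndata: done\n'
print(parse_sse(sample))
```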
### Configuration

The port configuration is controlled by environment variables:

```bash
# Main API port (default: 8000)
NBA_MCP_PORT=8000

# SSE server port (default: 8001)
NBA_MCP_SSE_PORT=8001

# Host to bind (default: 0.0.0.0)
FASTMCP_SSE_HOST=0.0.0.0
```
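Resolving these variables with the documented defaults is straightforward; a stdlib sketch (the helper name `port_config` is hypothetical, not the project's actual config loader):

```python
import os

def port_config(env=None) -> dict:
    """Resolve ports and bind host from the environment, with the documented defaults."""
    env = os.environ if env is None else env
    return {
        "api_port": int(env.get("NBA_MCP_PORT", "8000")),
        "sse_port": int(env.get("NBA_MCP_SSE_PORT", "8001")),
        "host": env.get("FASTMCP_SSE_HOST", "0.0.0.0"),
    }

print(port_config({}))                        # defaults
print(port_config({"NBA_MCP_PORT": "9000"}))  # override via environment
```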
### Running the Servers

#### Option 1: Using the helper script (recommended)

The simplest way to run both servers is the provided helper script:

```bash
python start_servers.py
```

This script starts both servers in parallel and handles logging and shutdown for you.

#### Option 2: Running servers individually

If you prefer to run the servers in separate terminals, start the main API server:

```bash
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
```

Then start the SSE server:

```bash
python run_sse.py --mode local
```
### Running the Example Agent

Once both servers are running, you can run the example LangGraph agent:

```bash
python examples/langgraph_ollama_agent_w_tools.py --mode local
```

The example automatically connects to both servers:

- Main API at http://localhost:8000
- SSE server at http://localhost:8001

### API Documentation

- OpenAPI documentation: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
### Debugging

If you encounter issues:

1. Verify both servers are running on their respective ports:

   ```bash
   curl http://localhost:8000/health
   curl http://localhost:8001/sse
   ```

2. Check environment variables:

   ```bash
   echo $NBA_MCP_PORT
   echo $NBA_MCP_SSE_PORT
   ```

3. Enable debug mode for more verbose logging:

   ```bash
   python run_sse.py --mode local --debug
   ```
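The two `curl` probes can also be scripted. This stdlib sketch builds the expected URLs (honoring the env-var overrides) and, when run directly, probes them; the `/health` and `/sse` paths are the same ones used above, and `health_urls` is a hypothetical helper name:

```python
import os
import urllib.request

def health_urls(api_port=None, sse_port=None) -> list[str]:
    """Build probe URLs for both servers, honoring env-var overrides."""
    api_port = api_port or int(os.environ.get("NBA_MCP_PORT", "8000"))
    sse_port = sse_port or int(os.environ.get("NBA_MCP_SSE_PORT", "8001"))
    return [
        f"http://localhost:{api_port}/health",
        f"http://localhost:{sse_port}/sse",
    ]

if __name__ == "__main__":
    for url in health_urls():
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                print(url, "->", resp.status)
        except OSError as exc:      # connection refused, timeout, DNS, ...
            print(url, "-> DOWN:", exc)
```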
### Architecture Details

#### Main Components

- **FastAPI application** (`app/main.py`):
  - Runs on `NBA_MCP_PORT` (default 8000)
  - Handles HTTP API requests, documentation, and other web services
- **SSE server** (`run_sse.py`):
  - Runs on `NBA_MCP_SSE_PORT` (default 8001)
  - Dedicated to SSE events using FastMCP
  - Handles real-time data streaming
- **NBA MCP server** (`app/services/mcp/nba_mcp/nba_server.py`):
  - Creates and configures the FastMCP server
  - Dynamically chooses its port based on mode (local/claude)
- **Example client** (`examples/langgraph_ollama_agent_w_tools.py`):
  - Connects to both servers
  - Demonstrates usage of NBA MCP tools via LangGraph