
Supabase MCP Server

This project implements a Model Context Protocol (MCP) server that provides tools for interacting with a Supabase database. The server enables AI assistants to perform database operations through a standardized interface.

Setup

  1. Clone the repository:

    git clone <repository_url>
    cd <repository_name>
    
  2. Install the dependencies:

    pip install -r requirements.txt
    
  3. The server accepts Supabase credentials in each client request rather than through environment variables. Clients should include the credentials in every request, for example:

    {
        "supabase_url": "your_supabase_url",
        "supabase_key": "your_supabase_service_role_key",
        "tool": "create_table",
        "arguments": {
            "table_name": "users",
            "schema": [
                {"name": "id", "type": "SERIAL PRIMARY KEY"},
                {"name": "name", "type": "TEXT"}
            ]
        }
    }
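A client can assemble and send such a request programmatically. The sketch below pipes the JSON to the server over standard input via a subprocess; the entry-point script name `server.py` is an assumption — substitute the actual script:

```python
import json
import subprocess
import sys

# Example request: create a "users" table. Credentials are placeholders.
request = {
    "supabase_url": "your_supabase_url",
    "supabase_key": "your_supabase_service_role_key",
    "tool": "create_table",
    "arguments": {
        "table_name": "users",
        "schema": [
            {"name": "id", "type": "SERIAL PRIMARY KEY"},
            {"name": "name", "type": "TEXT"},
        ],
    },
}

# Write the JSON request to the server's stdin and capture its reply.
# "server.py" is a hypothetical entry point.
proc = subprocess.run(
    [sys.executable, "server.py"],
    input=json.dumps(request),
    capture_output=True,
    text=True,
)
print(proc.stdout)
```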

Docker

  1. Build the Docker container:

    docker build -t mcp-server .
    
  2. Run the container:

    docker run -p 8000:8000 mcp-server
    

    This will build an image named mcp-server and then run it, mapping port 8000 on your host to port 8000 in the container.

GitHub Repository

To push the files to a GitHub repository, follow these steps:

  1. Initialize a Git repository in the current directory:

    git init
    
  2. Add all the files to the staging area:

    git add .
    
  3. Commit the changes:

    git commit -m "Initial commit"
    
  4. Rename the branch to main:

    git branch -M main
    
  5. Add the remote repository:

    git remote add origin https://github.com/amraly83/supabase-mcp-server.git
    
  6. Push the files to the repository:

    git push -u origin main
    

    You may be prompted for your GitHub username and credentials. Note that GitHub requires a personal access token (not your account password) for HTTPS pushes.

Usage

The server provides the following tools:

  • read_rows(table_name: str = None, limit: int = 10): Reads rows from a Supabase table.
    • table_name: The name of the table to read from. This is an optional parameter. If no table name is provided, it will return rows from all tables.
    • limit: The maximum number of rows to return. Defaults to 10. This is an optional parameter.
  • create_record(table_name: str, record: dict): Creates a new record in a Supabase table.
    • table_name: The name of the table to create the record in.
    • record: A dictionary containing the data for the new record.
  • update_record(table_name: str, record_id: int, updates: dict): Updates an existing record in a Supabase table.
    • table_name: The name of the table to update the record in.
    • record_id: The ID of the record to update.
    • updates: A dictionary containing the updates to apply to the record.
  • delete_record(table_name: str, record_id: int): Deletes a record from a Supabase table.
    • table_name: The name of the table to delete the record from.
    • record_id: The ID of the record to delete.
  • list_tables(): Lists all tables in the Supabase database.
  • create_table(table_name: str, schema: list): Creates a new table in the Supabase database.
    • table_name: The name of the table to create.
    • schema: A list of dictionaries, where each dictionary represents a column in the table. Each dictionary must have the keys "name" and "type".
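Each tool call is wrapped in the same JSON envelope, so a thin client helper can build requests for any of the tools above. This is a sketch; the envelope fields follow the examples in this README, and the URL/key values are placeholders:

```python
import json


def build_request(url: str, key: str, tool: str, **arguments) -> str:
    """Serialize a tool call into the JSON envelope the server expects."""
    return json.dumps({
        "supabase_url": url,
        "supabase_key": key,
        "tool": tool,
        "arguments": arguments,
    })


# Requests mirroring the tool signatures above:
read_req = build_request("your_supabase_url", "your_key",
                         "read_rows", table_name="products", limit=5)
delete_req = build_request("your_supabase_url", "your_key",
                           "delete_record", table_name="users", record_id=42)
```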

Example

To use the server, you can send JSON requests to the server's standard input. For example, to read the first 5 rows from a table named "products", you would send the following JSON:

    {
        "supabase_url": "your_supabase_url",
        "supabase_key": "your_supabase_service_role_key",
        "tool": "read_rows",
        "arguments": {
            "table_name": "products",
            "limit": 5
        }
    }

To list all tables in the Supabase database, you would send the following JSON:

    {
        "supabase_url": "your_supabase_url",
        "supabase_key": "your_supabase_service_role_key",
        "tool": "list_tables",
        "arguments": {}
    }

To create a new table named "users" with columns "id" (SERIAL PRIMARY KEY) and "name" (TEXT), you would send the following JSON:

    {
        "supabase_url": "your_supabase_url",
        "supabase_key": "your_supabase_service_role_key",
        "tool": "create_table",
        "arguments": {
            "table_name": "users",
            "schema": [
                {"name": "id", "type": "SERIAL PRIMARY KEY"},
                {"name": "name", "type": "TEXT"}
            ]
        }
    }
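As a client-side sanity check before sending a request, the tool names and required arguments listed in the Usage section can be encoded in a small validator. This is a sketch: the requirement sets are inferred from the documentation above, and the server is assumed to perform its own validation as well:

```python
# Known tools and their required argument names (from the Usage section).
# table_name and limit are optional for read_rows.
REQUIRED_ARGS = {
    "read_rows": set(),
    "create_record": {"table_name", "record"},
    "update_record": {"table_name", "record_id", "updates"},
    "delete_record": {"table_name", "record_id"},
    "list_tables": set(),
    "create_table": {"table_name", "schema"},
}


def validate_request(request: dict) -> list:
    """Return a list of problems; an empty list means the request looks well-formed."""
    problems = []
    for field in ("supabase_url", "supabase_key", "tool", "arguments"):
        if field not in request:
            problems.append(f"missing field: {field}")
    tool = request.get("tool")
    if tool not in REQUIRED_ARGS:
        problems.append(f"unknown tool: {tool}")
    else:
        missing = REQUIRED_ARGS[tool] - set(request.get("arguments", {}))
        problems.extend(f"missing argument: {a}" for a in sorted(missing))
    return problems
```

Catching a missing `table_name` or `schema` locally avoids a round trip to the server for a request that cannot succeed.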
