MCP Toolbox for Databases

[!NOTE] MCP Toolbox for Databases is currently in beta, and may see breaking changes until the first stable release (v1.0).

MCP Toolbox for Databases is an open source MCP server for databases. It was designed with enterprise-grade quality and production use in mind. It enables you to develop tools more easily, faster, and more securely by handling complexities such as connection pooling, authentication, and more.

This README provides a brief overview. For comprehensive details, see the full documentation.

[!NOTE] This product was originally named “Gen AI Toolbox for Databases” as its initial development predated MCP, but was renamed to align with recently added MCP compatibility.


Why Toolbox?

Toolbox helps you build Gen AI tools that let your agents access data in your database. Toolbox provides:

  • Simplified development: Integrate tools into your agent in less than 10 lines of code, reuse tools between multiple agents or frameworks, and deploy new versions of tools more easily.
  • Better performance: Built-in best practices such as connection pooling, authentication, and more.
  • Enhanced security: Integrated auth for more secure access to your data.
  • End-to-end observability: Out of the box metrics and tracing with built-in support for OpenTelemetry.

General Architecture

Toolbox sits between your application's orchestration framework and your database, providing a control plane that is used to modify, distribute, or invoke tools. It simplifies the management of your tools by providing you with a centralized location to store and update tools, allowing you to share tools between agents and applications and update those tools without necessarily redeploying your application.

[Architecture diagram]

Getting Started

Installing the server

For the latest version, check the releases page and use the following instructions for your OS and CPU architecture.

Binary

To install Toolbox as a binary:

# see releases page for other versions
export VERSION=0.3.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
Container image

You can also install Toolbox as a container:
# see releases page for other versions
export VERSION=0.3.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
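Once the image is pulled, one way to run it is to mount your tools.yaml into the container and publish the server port. This is a sketch, not a documented invocation: the in-container path /app/tools.yaml, the published port 5000, and the assumption that the image's entrypoint is the toolbox binary are all illustrative.

# sketch: mount path, port, and entrypoint behavior are assumptions; see the deployment docs
docker run --rm -p 5000:5000 \
  -v $(pwd)/tools.yaml:/app/tools.yaml \
  us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION \
  --tools_file "/app/tools.yaml"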
Compile from source

To install from source, ensure you have the latest version of Go installed, and then run the following command:

go install github.com/googleapis/genai-toolbox@v0.3.0

Running the server

Configure a tools.yaml to define your tools, and then execute toolbox to start the server:

./toolbox --tools_file "tools.yaml"

You can use toolbox help for a full list of flags! To stop the server, send a terminate signal (ctrl+c on most platforms).

For more detailed documentation on deploying to different environments, check out the resources in the How-to section.

Integrating your application

Once your server is up and running, you can load the tools into your application. See below for the list of client SDKs for various frameworks:

Core
  1. Install Toolbox Core SDK:
    pip install toolbox-core
    
  2. Load tools:
    from toolbox_core import ToolboxClient
    
    # update the url to point to your server
    client = ToolboxClient("http://127.0.0.1:5000")
    
    # these tools can be passed to your application! 
    tools = await client.load_toolset("toolset_name")
    

For more detailed instructions on using the Toolbox Core SDK, see the project's README.
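Because load_toolset is a coroutine, the snippet above needs to run inside an async context. A minimal, self-contained sketch (the server URL and "toolset_name" are the placeholders from the example above):

import asyncio

from toolbox_core import ToolboxClient


async def main():
    # update the url to point to your server
    client = ToolboxClient("http://127.0.0.1:5000")

    # "toolset_name" is a placeholder for a toolset defined in your tools.yaml
    tools = await client.load_toolset("toolset_name")

    # hand the loaded tools to your agent framework of choice
    print(f"Loaded {len(tools)} tools")


asyncio.run(main())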

LangChain / LangGraph
  1. Install Toolbox LangChain SDK:
    pip install toolbox-langchain
    
  2. Load tools:
    from toolbox_langchain import ToolboxClient
    
    # update the url to point to your server
    client = ToolboxClient("http://127.0.0.1:5000")
    
    # these tools can be passed to your application! 
    tools = client.load_toolset()
    

For more detailed instructions on using the Toolbox LangChain SDK, see the project's README.
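To show where these tools typically end up, here is a hedged sketch that hands them to a LangGraph ReAct agent. The ChatVertexAI model and the sample question are assumptions for illustration; any LangChain-compatible chat model should work:

from langchain_google_vertexai import ChatVertexAI
from langgraph.prebuilt import create_react_agent
from toolbox_langchain import ToolboxClient

# update the url to point to your server
client = ToolboxClient("http://127.0.0.1:5000")
tools = client.load_toolset()

# assumption: a Vertex AI chat model; substitute any LangChain chat model
model = ChatVertexAI(model_name="gemini-1.5-pro")

# build a ReAct agent that can call the Toolbox tools
agent = create_react_agent(model, tools)
response = agent.invoke({"messages": [("user", "Find hotels with 'Hilton' in the name.")]})
print(response["messages"][-1].content)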

LlamaIndex
  1. Install Toolbox Llamaindex SDK:
    pip install toolbox-llamaindex
    
  2. Load tools:
    from toolbox_llamaindex import ToolboxClient
    
    # update the url to point to your server
    client = ToolboxClient("http://127.0.0.1:5000")
    
    # these tools can be passed to your application! 
    tools = client.load_toolset()
    

For more detailed instructions on using the Toolbox Llamaindex SDK, see the project's README.
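A similar hedged sketch for LlamaIndex, passing the loaded tools to a ReAct agent. The OpenAI LLM is purely illustrative; use whichever LlamaIndex-supported LLM you have configured:

from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI
from toolbox_llamaindex import ToolboxClient

# update the url to point to your server
client = ToolboxClient("http://127.0.0.1:5000")
tools = client.load_toolset()

# assumption: an OpenAI model; substitute any LlamaIndex-supported LLM
llm = OpenAI(model="gpt-4o-mini")

# build a ReAct agent that can call the Toolbox tools
agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)
print(agent.chat("Find hotels with 'Hilton' in the name."))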

Configuration

The primary way to configure Toolbox is through the tools.yaml file. If you have multiple files, you can tell toolbox which to load with the --tools_file tools.yaml flag.

You can find more detailed reference documentation for all resource types in the Resources section.

Sources

The sources section of your tools.yaml defines what data sources your Toolbox should have access to. Most tools will have at least one source to execute against.

sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password

For more details on configuring different types of sources, see the Sources section.
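For context, the example source above assumes the PostgreSQL database and user already exist. A hypothetical setup matching those placeholder values might look like this:

-- run as a PostgreSQL superuser; values mirror the example source above
CREATE DATABASE toolbox_db;
CREATE USER toolbox_user WITH PASSWORD 'my-password';
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;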

Tools

The tools section of a tools.yaml defines the actions an agent can take: what kind of tool it is, which source(s) it affects, what parameters it uses, etc.

tools:
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';

For more details on configuring different types of tools, see the Tools section.
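The example tool also assumes a hotels table exists in the source database. A hypothetical schema that would satisfy the statement above (the columns are illustrative; any table with a text name column works):

-- illustrative schema only
CREATE TABLE hotels (
  id              SERIAL PRIMARY KEY,
  name            TEXT NOT NULL,
  location        TEXT,
  price_per_night NUMERIC
);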

Toolsets

The toolsets section of your tools.yaml allows you to define groups of tools that you want to be able to load together. This can be useful for defining different groups based on agent or application.

toolsets:
    my_first_toolset:
        - my_first_tool
        - my_second_tool
    my_second_toolset:
        - my_second_tool
        - my_third_tool

You can load toolsets by name:

# This will load all tools
all_tools = client.load_toolset()

# This will only load the tools listed in 'my_second_toolset'
my_second_toolset = client.load_toolset("my_second_toolset")
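Putting the configuration pieces together, a complete tools.yaml combining the example source, tool, and toolset from this section might look like the sketch below (the toolset name hotel_toolset is illustrative):

sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password

tools:
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';

toolsets:
  hotel_toolset:
    - search-hotels-by-name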

Versioning

This project uses semantic versioning, including a MAJOR.MINOR.PATCH version number that increments with:

  • MAJOR version when we make incompatible API changes
  • MINOR version when we add functionality in a backward compatible manner
  • PATCH version when we make backward compatible bug fixes

The public API that this applies to is the CLI associated with Toolbox, the interactions with official SDKs, and the definitions in the tools.yaml file.

Contributing

Contributions are welcome. Please see CONTRIBUTING to get started.

Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Contributor Code of Conduct for more information.
