n8n-nodes-mcp-client

This is an n8n community node that lets you interact with Model Context Protocol (MCP) servers in your n8n workflows.

MCP is a protocol that enables AI models to interact with external tools and data sources in a standardized way. This node allows you to connect to MCP servers, access resources, execute tools, and use prompts.

n8n is a fair-code licensed workflow automation platform.

  • Installation
  • Credentials
  • Environment Variables
  • Operations
  • Using as a Tool
  • Compatibility
  • Resources

Getting Started

Official Quickstart Video:

MCP Client Node Quickstart

Community Videos

Shoutout to all the creators of the following n8n community videos that are great resources for learning how to use this node:

If you have a great video that you'd like to share, please let me know and I'll add it to the list!

Interested in a deeper dive into MCP?

Check out my YouTube Series MCP Explained for more information about the Model Context Protocol.

Installation

Follow the installation guide in the n8n community nodes documentation.

Also pay attention to the Environment Variables section below: if you want to use the MCP Client node as a tool in AI Agents, it's mandatory to set the N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE environment variable to true.

Credentials

The MCP Client node supports two types of credentials to connect to an MCP server:

Command-line Based Transport (STDIO)

MCP Client STDIO Credentials

  • Command: The command to start the MCP server
  • Arguments: Optional arguments to pass to the server command
  • Environment Variables: Variables to pass to the server in NAME=VALUE format
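
For example, a filled-in STDIO credential for the Brave Search server (the same setup used in the walkthrough further below; your-api-key is a placeholder) might look like this:

    Command: npx
    Arguments: -y @modelcontextprotocol/server-brave-search
    Environment Variables: BRAVE_API_KEY=your-api-key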

Server-Sent Events (SSE) Transport

MCP Client SSE Credentials

  • SSE URL: The URL of the SSE endpoint (default: http://localhost:3001/sse)
  • Messages Post Endpoint: Optional custom endpoint for posting messages if different from the SSE URL
  • Additional Headers: Optional headers to send with requests (format: name:value, one per line)
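
As a rough sketch, an SSE credential for a locally running server that requires a bearer token (the Authorization header and token value here are purely illustrative) could be filled in like this:

    SSE URL: http://localhost:3001/sse
    Messages Post Endpoint: (left empty)
    Additional Headers: Authorization:Bearer your-token-here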

Environment Variables

When using the command-line based (STDIO) transport, the MCP Client node can pass environment variables to MCP servers in two ways:

1. Using the Credentials UI

You can add environment variables directly in the credentials configuration:

Environment Variables in Credentials

This method is useful for individual setups and testing. The values are stored securely as credentials in n8n.
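
For instance, the Environment Variables field accepts NAME=VALUE pairs (space-, comma-, or newline-separated; shown newline-separated here, and the second variable is just a hypothetical custom setting):

    BRAVE_API_KEY=your-api-key
    CUSTOM_SETTING=some-value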

2. Using Docker Environment Variables

For Docker deployments, you can pass environment variables directly to your MCP servers by prefixing them with MCP_:

version: '3'

services:
  n8n:
    image: n8nio/n8n
    environment:
      - MCP_BRAVE_API_KEY=your-api-key-here
      - MCP_OPENAI_API_KEY=your-openai-key-here
      - MCP_CUSTOM_SETTING=some-value
    # other configuration...

These environment variables will be automatically passed to your MCP servers when they are executed.

Example: Using Brave Search MCP Server

This example shows how to set up and use the Brave Search MCP server:

  1. Install the Brave Search MCP server:

    npm install -g @modelcontextprotocol/server-brave-search
    
  2. Configure MCP Client credentials:

    • Command: npx
    • Arguments: -y @modelcontextprotocol/server-brave-search
    • Environment Variables: BRAVE_API_KEY=your-api-key (variables can be space-, comma-, or newline-separated)
  3. Create a workflow that uses the MCP Client node:

    • Add an MCP Client node
    • Select the "List Tools" operation to see available search tools
    • Add another MCP Client node
    • Select the "Execute Tool" operation
    • Choose the "brave_search" tool
    • Set Parameters to: {"query": "latest AI news"}

Brave Search Example

The node will execute the search and return the results in the output.
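
In a real workflow the query will often come from a previous node rather than being hard-coded. A minimal sketch using n8n's expression syntax (the searchTerm field name is just an assumption about your upstream data) would set Parameters to:

    {"query": "{{ $json.searchTerm }}"}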

Example: Multi-Server Setup with AI Agent

This example demonstrates how to set up multiple MCP servers in a production environment and use them with an AI agent:

  1. Configure your docker-compose.yml file:
version: '3'

services:
  n8n:
    image: n8nio/n8n
    environment:
      # MCP server environment variables
      - MCP_BRAVE_API_KEY=your-brave-api-key
      - MCP_OPENAI_API_KEY=your-openai-key
      - MCP_SERPER_API_KEY=your-serper-key
      - MCP_WEATHER_API_KEY=your-weather-api-key

      # Enable community nodes as tools
      - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
    ports:
      - "5678:5678"
    volumes:
      - ~/.n8n:/home/node/.n8n
  2. Create multiple MCP Client credentials in n8n:

    Brave Search Credentials:

    • Command: npx
    • Arguments: -y @modelcontextprotocol/server-brave-search

    OpenAI Tools Credentials:

    • Command: npx
    • Arguments: -y @modelcontextprotocol/server-openai

    Web Search Credentials:

    • Command: npx
    • Arguments: -y @modelcontextprotocol/server-serper

    Weather API Credentials:

    • Command: npx
    • Arguments: -y @modelcontextprotocol/server-weather
  3. Create an AI Agent workflow:

    • Add an AI Agent node
    • Enable MCP Client as a tool
    • Configure different MCP Client nodes with different credentials
    • Create a prompt that uses multiple data sources

Multi-Server Setup

Example AI Agent prompt:

I need you to help me plan a trip. First, search for popular destinations in {destination_country}.
Then, check the current weather in the top 3 cities.
Finally, find some recent news about travel restrictions for these places.

With this setup, the AI agent can use multiple MCP tools across different servers, all using environment variables configured in your Docker deployment.

Example: Using a Local MCP Server with SSE

This example shows how to connect to a locally running MCP server using Server-Sent Events (SSE):

  1. Start a local MCP server that supports SSE:

    npx @modelcontextprotocol/server-example-sse
    

    Or run your own custom MCP server with SSE support on port 3001.

  2. Configure MCP Client credentials:

    • In the node settings, select Connection Type: Server-Sent Events (SSE)
    • Create new credentials of type MCP Client (SSE) API
    • Set SSE URL: http://localhost:3001/sse
    • Add any required headers if your server needs authentication
  3. Create a workflow that uses the MCP Client node:

    • Add an MCP Client node
    • Set the Connection Type to Server-Sent Events (SSE)
    • Select your SSE credentials
    • Select the "List Tools" operation to see available tools
    • Execute the workflow to see the results

SSE Example

This method is particularly useful when:

  • Your MCP server is running as a standalone service
  • You're connecting to a remote MCP server
  • Your server requires special authentication headers
  • You need to separate the transport channel from the message channel
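
If the node cannot connect, a quick sanity check is to watch the event stream directly from the command line (assuming the default URL from the credentials above; curl's -N flag disables output buffering so events appear as they arrive):

    curl -N http://localhost:3001/sse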

Operations

The MCP Client node supports the following operations:

MCP Client Operations

  • Execute Tool - Execute a specific tool with parameters
  • Get Prompt - Get a specific prompt template
  • List Prompts - Get a list of available prompts
  • List Resources - Get a list of available resources from the MCP server
  • List Tools - Get a list of available tools
  • Read Resource - Read a specific resource by URI

Example: List Tools Operation

List Tools Example

The List Tools operation returns all available tools from the MCP server, including their names, descriptions, and parameter schemas.
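
The exact output shape depends on the server, but each returned tool is roughly an object with a name, a description, and a JSON Schema for its parameters. A sketch of what a single entry for the brave_search tool used above might look like (the description text and exact field names are illustrative):

    {
      "name": "brave_search",
      "description": "Search the web with Brave Search",
      "inputSchema": {
        "type": "object",
        "properties": {
          "query": { "type": "string" }
        },
        "required": ["query"]
      }
    }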

Example: Execute Tool Operation

Execute Tool Example

The Execute Tool operation allows you to execute a specific tool with parameters. Make sure to select the tool you want to execute from the dropdown menu.

Using as a Tool

This node can be used as a tool in n8n AI Agents. To enable community nodes as tools, you need to set the N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE environment variable to true.

Setting the Environment Variable

If you're using a bash/zsh shell:

export N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
n8n start

If you're using Docker: Add to your docker-compose.yml file:

environment:
  - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true

If you're using the desktop app: Create a .env file in the n8n directory:

N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true

If you want to set it permanently on Mac/Linux: Add to your ~/.zshrc or ~/.bash_profile:

export N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true

Example of an AI Agent workflow results:

AI Agent Example

After setting this environment variable and restarting n8n, your MCP Client node will be available as a tool in AI Agent nodes.

Compatibility

  • Requires n8n version 1.0.0 or later
  • Compatible with MCP Protocol version 1.0.0 or later
  • Supports both STDIO and SSE transports for connecting to MCP servers
  • SSE transport requires a server that implements the MCP Server-Sent Events specification

Resources
