n8n MCP Server
A Model Context Protocol (MCP) server that enables seamless management of n8n workflows directly from LLMs and AI agents.
Features
- List available workflows from n8n
- View workflow details
- Execute workflows
- Monitor workflow executions
- Pass parameters to workflows
- MCP-compatible interface for AI agents
Getting Started
Quick Start
- Install the package:
  npm install @dopehunter/n8n-mcp-server
- Create a .env file:
  cp .env.example .env
- Configure your n8n connection by editing the .env file and setting (see the sample .env after this list):
  - N8N_BASE_URL: URL to your n8n instance (e.g., http://localhost:5678/api)
  - N8N_API_KEY: your n8n API key (generate this in the n8n settings)
- Start the server:
  npm start
- Test the server:
  curl -X POST http://localhost:3000/mcp \
    -H "Content-Type: application/json" \
    -d '{"jsonrpc":"2.0","id":"1","method":"mcp.tools.list","params":{}}'
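A minimal .env for a local n8n instance might look like the following; the values are placeholders and the API key comes from your own n8n settings:
# Sample .env (placeholder values)
N8N_BASE_URL=http://localhost:5678/api
N8N_API_KEY=your-n8n-api-key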
Common Issues and Troubleshooting
- Connection Refused Errors: Make sure your n8n instance is running and accessible at the URL specified in N8N_BASE_URL
- API Key Issues: Verify your n8n API key is correct and has appropriate permissions
- Docker Issues: Ensure Docker is running before attempting to build or run the Docker image
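A quick way to rule out the first two issues is to query the n8n REST API directly, bypassing the MCP server. The sketch below assumes N8N_BASE_URL and N8N_API_KEY are exported in your shell and that your n8n instance exposes the standard public API (X-N8N-API-KEY header, /v1/workflows endpoint):
# Should return a JSON list of workflows if n8n is reachable and the key is valid
curl -H "X-N8N-API-KEY: $N8N_API_KEY" "$N8N_BASE_URL/v1/workflows"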
For more detailed troubleshooting, see the Troubleshooting Guide.
Components
Tools
- n8n_list_workflows
  - List all workflows in the n8n instance
  - Input: None
- n8n_get_workflow
  - Get details of a specific workflow
  - Input: workflowId (string, required): ID of the workflow to retrieve
- n8n_execute_workflow
  - Execute an n8n workflow
  - Inputs:
    - workflowId (string, required): ID of the workflow to execute
    - data (object, optional): Data to pass to the workflow
- n8n_get_executions
  - Get execution history for a workflow
  - Inputs:
    - workflowId (string, required): ID of the workflow to get executions for
    - limit (number, optional): Maximum number of executions to return
- n8n_activate_workflow
  - Activate a workflow
  - Input: workflowId (string, required): ID of the workflow to activate
- n8n_deactivate_workflow
  - Deactivate a workflow
  - Input: workflowId (string, required): ID of the workflow to deactivate
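As an illustration, a client could invoke n8n_execute_workflow over the same HTTP endpoint used in the Quick Start. The sketch below assumes an mcp.tools.call method analogous to the documented mcp.tools.list (the exact method name may differ in this server); the workflow ID and data payload are placeholders:
# Hypothetical tool call; method name, workflow ID, and data payload are assumptions
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":"2","method":"mcp.tools.call","params":{"name":"n8n_execute_workflow","arguments":{"workflowId":"your-workflow-id","data":{"example":"value"}}}}'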
Prerequisites
- Node.js (v14+)
- n8n instance with API access
- An LLM or AI agent that supports the Model Context Protocol
Configuration Options
Docker Configuration
{
"mcpServers": {
"n8n": {
"command": "docker",
"args": ["run", "-i", "--rm", "--init", "-e", "N8N_API_KEY=$N8N_API_KEY", "-e", "N8N_BASE_URL=$N8N_BASE_URL", "mcp/n8n-mcp-server"]
}
}
}
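The Docker configuration above passes $N8N_API_KEY and $N8N_BASE_URL through to the container, so both variables need to be defined in the environment that launches the MCP client; for example (placeholder values):
export N8N_BASE_URL="http://localhost:5678/api"
export N8N_API_KEY="your-n8n-api-key"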
NPX Configuration
{
"mcpServers": {
"n8n": {
"command": "npx",
"args": ["-y", "@dopehunter/n8n-mcp-server"]
}
}
}
Installation
NPM
npm install @dopehunter/n8n-mcp-server
Direct Usage with npx
npx @dopehunter/n8n-mcp-server
From Source
git clone https://github.com/dopehunter/n8n_MCP_server_complete.git
cd n8n_MCP_server_complete
npm install
cp .env.example .env
# Edit the .env file with your n8n API details
Development
Start the development server:
npm run start:dev
Build the project:
npm run build
Run tests:
npm test
Usage With Claude or Other LLMs
- Start the MCP server:
  npm start
- Configure your LLM client to use the MCP server:
  - For Claude Desktop, use the configuration from the "Configuration Options" section.
  - For other clients, point to the server URL (e.g., http://localhost:3000/mcp).
- Your LLM can now use n8n workflows directly through MCP commands.
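For Claude Desktop specifically, the mcpServers block from "Configuration Options" normally lives in its claude_desktop_config.json file; a sketch for checking the default macOS location (the path varies by platform and installation):
# Default Claude Desktop config path on macOS (assumption: standard install)
cat ~/Library/Application\ Support/Claude/claude_desktop_config.json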
Building Docker Image
docker build -t mcp/n8n-mcp-server .
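Once built, the image can be run manually with the same flags the Docker configuration above uses, which is a convenient way to verify it before wiring it into an MCP client (environment values are placeholders):
docker run -i --rm --init \
  -e N8N_API_KEY="$N8N_API_KEY" \
  -e N8N_BASE_URL="$N8N_BASE_URL" \
  mcp/n8n-mcp-server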
API Documentation
See the API Documentation for details on the available MCP functions.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the ISC License.