
Mirror of https://github.com/lupuletic/onyx-mcp-server
Onyx MCP Server
A Model Context Protocol (MCP) server for seamless integration with Onyx AI knowledge bases.
This MCP server connects any MCP-compatible client to your Onyx knowledge base, allowing you to search and retrieve relevant context from your documents. It provides a bridge between MCP clients and the Onyx API, enabling powerful semantic search and chat capabilities.
Features
- Enhanced Search: Semantic search across your Onyx document sets with LLM relevance filtering
- Context Window Retrieval: Retrieve chunks above and below the matching chunk for better context
- Full Document Retrieval: Option to retrieve entire documents instead of just chunks
- Chat Integration: Use Onyx's powerful chat API with LLM + RAG for comprehensive answers
- Configurable Document Set Filtering: Target specific document sets for more relevant results
Installation
Installing via Smithery
To install Onyx MCP Server for Claude Desktop automatically via Smithery:
```bash
npx -y @smithery/cli install @lupuletic/onyx-mcp-server --client claude
```
Prerequisites
- Node.js (v16 or higher)
- An Onyx instance with API access
- An Onyx API token
Setup
1. Clone the repository:

   ```bash
   git clone https://github.com/lupuletic/onyx-mcp-server.git
   cd onyx-mcp-server
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Build the server:

   ```bash
   npm run build
   ```

4. Configure your Onyx API token:

   ```bash
   export ONYX_API_TOKEN="your-api-token-here"
   export ONYX_API_URL="http://localhost:8080/api"  # Adjust as needed
   ```

5. Start the server:

   ```bash
   npm start
   ```
Configuring MCP Clients
For Claude Desktop App
Add to `~/Library/Application Support/Claude/claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "onyx-search": {
      "command": "node",
      "args": ["/path/to/onyx-mcp-server/build/index.js"],
      "env": {
        "ONYX_API_TOKEN": "your-api-token-here",
        "ONYX_API_URL": "http://localhost:8080/api"
      },
      "disabled": false,
      "alwaysAllow": []
    }
  }
}
```
For Claude in VSCode (Cline)
Add to your Cline MCP settings file:
```json
{
  "mcpServers": {
    "onyx-search": {
      "command": "node",
      "args": ["/path/to/onyx-mcp-server/build/index.js"],
      "env": {
        "ONYX_API_TOKEN": "your-api-token-here",
        "ONYX_API_URL": "http://localhost:8080/api"
      },
      "disabled": false,
      "alwaysAllow": []
    }
  }
}
```
For Other MCP Clients
Consult your MCP client's documentation for how to add a custom MCP server. You'll need to provide:
- The command to run the server (`node`)
- The path to the built server file (`/path/to/onyx-mcp-server/build/index.js`)
- Environment variables for `ONYX_API_TOKEN` and `ONYX_API_URL`
Available Tools
Once configured, your MCP client will have access to two powerful tools:
1. Search Tool
The `search_onyx` tool provides direct access to Onyx's search capabilities with enhanced context retrieval:
```xml
<use_mcp_tool>
<server_name>onyx-search</server_name>
<tool_name>search_onyx</tool_name>
<arguments>
{
  "query": "customer onboarding process",
  "documentSets": ["Company Policies", "Training Materials"],
  "maxResults": 3,
  "chunksAbove": 1,
  "chunksBelow": 1,
  "retrieveFullDocuments": true
}
</arguments>
</use_mcp_tool>
```
Parameters:
- `query` (required): The topic to search for
- `documentSets` (optional): List of document set names to search within (empty for all)
- `maxResults` (optional): Maximum number of results to return (default: 5, max: 10)
- `chunksAbove` (optional): Number of chunks to include above the matching chunk (default: 1)
- `chunksBelow` (optional): Number of chunks to include below the matching chunk (default: 1)
- `retrieveFullDocuments` (optional): Whether to retrieve full documents instead of just chunks (default: false)
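Since every parameter except `query` has a default, a minimal call can omit the rest entirely (the query text here is illustrative):

```xml
<use_mcp_tool>
<server_name>onyx-search</server_name>
<tool_name>search_onyx</tool_name>
<arguments>
{
  "query": "vacation policy"
}
</arguments>
</use_mcp_tool>
```

This searches all document sets, returns up to 5 results, and includes one chunk of context on each side of every match.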
2. Chat Tool
The `chat_with_onyx` tool leverages Onyx's powerful chat API with LLM + RAG for comprehensive answers:
```xml
<use_mcp_tool>
<server_name>onyx-search</server_name>
<tool_name>chat_with_onyx</tool_name>
<arguments>
{
  "query": "What is our company's policy on remote work?",
  "personaId": 15,
  "documentSets": ["Company Policies", "HR Documents"],
  "chatSessionId": "optional-existing-session-id"
}
</arguments>
</use_mcp_tool>
```
Parameters:
- `query` (required): The question to ask Onyx
- `personaId` (optional): The ID of the persona to use (default: 15)
- `documentSets` (optional): List of document set names to search within (empty for all)
- `chatSessionId` (optional): Existing chat session ID to continue a conversation
Chat Sessions
The chat tool supports maintaining conversation context across multiple interactions. After the first call, the response will include a `chat_session_id` in the metadata. You can pass this ID in subsequent calls to maintain context.
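For example, a follow-up call that continues an earlier conversation might look like this (the session ID value is illustrative, taken from the previous response's metadata):

```xml
<use_mcp_tool>
<server_name>onyx-search</server_name>
<tool_name>chat_with_onyx</tool_name>
<arguments>
{
  "query": "Does that policy also apply to contractors?",
  "chatSessionId": "session-id-from-previous-response"
}
</arguments>
</use_mcp_tool>
```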
Choosing Between Search and Chat
- Use Search When: You need specific, targeted information from documents and want to control exactly how much context is retrieved.
- Use Chat When: You need comprehensive answers that combine information from multiple sources, or when you want the LLM to synthesize information for you.
For the best results, use both tools in combination: search for specific details, and chat for comprehensive understanding.
Use Cases
- Knowledge Management: Access your organization's knowledge base through any MCP-compatible interface
- Customer Support: Help support agents quickly find relevant information
- Research: Conduct deep research across your organization's documents
- Training: Provide access to training materials and documentation
- Policy Compliance: Ensure teams have access to the latest policies and procedures
Development
Running in Development Mode
```bash
npm run dev
```
Committing Changes
This project enforces the Conventional Commits specification for all commit messages. To make this easier, we provide an interactive commit tool:
```bash
npm run commit
```
This will guide you through creating a properly formatted commit message. Alternatively, you can write your own commit messages following the conventional format:
```
<type>[optional scope]: <description>

[optional body]

[optional footer(s)]
```

where `type` is one of: `feat`, `fix`, `docs`, `style`, `refactor`, `perf`, `test`, `build`, `ci`, `chore`, `revert`.
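As an illustration of the format (this is a sketch, not the repo's actual git-hook implementation), a minimal Conventional Commits check for a commit subject line could look like:

```typescript
// Sketch of a Conventional Commits subject-line check. The type list
// mirrors the one above; names here are hypothetical, not project code.
const TYPES = ["feat", "fix", "docs", "style", "refactor", "perf",
               "test", "build", "ci", "chore", "revert"];

// type, optional (scope), optional ! for breaking changes, then ": <description>"
const COMMIT_RE = new RegExp(`^(${TYPES.join("|")})(\\([\\w.-]+\\))?!?: .+`);

function isConventional(subject: string): boolean {
  return COMMIT_RE.test(subject);
}

console.log(isConventional("feat(search): add chunksBelow option")); // true
console.log(isConventional("updated some files"));                   // false
```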
Building for Production
```bash
npm run build
```
Testing
Run the test suite:

```bash
npm test
```

Run tests with coverage:

```bash
npm run test:coverage
```
Linting
```bash
npm run lint
```

Fix linting issues:

```bash
npm run lint:fix
```
Continuous Integration
This project uses GitHub Actions for continuous integration and deployment. The CI pipeline runs on every push to the main branch and on pull requests. It performs the following checks:
- Linting
- Building
- Testing
- Code coverage reporting
Automated Version Bumping and Publishing
When a PR is merged to the main branch, the project automatically determines the appropriate version bump type and publishes to npm. The system analyzes both PR titles and commit messages to determine the version bump type.
1. PR Title Validation: All PR titles are validated against the Conventional Commits specification:
   - PR titles must start with a type (e.g., `feat:`, `fix:`, `docs:`)
   - This validation happens automatically when a PR is created or updated
   - PRs with invalid titles will fail the validation check

2. Commit Message Validation: All commit messages are also validated against the Conventional Commits format:
   - Commit messages must start with a type (e.g., `feat:`, `fix:`, `docs:`)
   - This is enforced by git hooks that run when you commit
   - Commits with invalid messages will be rejected
   - Use `npm run commit` for an interactive commit message creation tool

3. Version Bump Determination: The system analyzes both the PR title and commit messages to determine the appropriate version bump:
   - PR titles starting with `feat` or containing new features → minor version bump
   - PR titles starting with `fix` or containing bug fixes → patch version bump
   - PR titles containing `BREAKING CHANGE` or with an exclamation mark → major version bump
   - If the PR title doesn't indicate a specific bump type, the system analyzes commit messages
   - The highest-priority bump type found in any commit message is used (major > minor > patch)
   - If no conventional commit prefixes are found, the system defaults to a patch version bump without failing

4. Version Update and Publishing:
   - Bumps the version in `package.json` according to semantic versioning
   - Commits and pushes the version change
   - Publishes the new version to npm
This automated process ensures versioning that consistently reflects the nature of the changes, follows semantic versioning principles, and eliminates manual version management.
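The bump-selection rules above can be sketched roughly as follows (function names and regexes are hypothetical; the project's actual GitHub Actions workflow may differ in detail):

```typescript
type Bump = "major" | "minor" | "patch";

// Rough sketch of the priority order described above: major > minor > patch,
// falling back to a patch bump when no conventional prefix is found.
function bumpFor(prTitle: string, commitMessages: string[]): Bump {
  const all = [prTitle, ...commitMessages];
  const isMajor = (m: string) =>
    m.includes("BREAKING CHANGE") || /^\w+(\([\w.-]+\))?!:/.test(m);
  const isMinor = (m: string) => /^feat(\([\w.-]+\))?:/.test(m);
  const isPatch = (m: string) => /^fix(\([\w.-]+\))?:/.test(m);

  if (all.some(isMajor)) return "major";
  if (all.some(isMinor)) return "minor";
  if (all.some(isPatch)) return "patch";
  return "patch"; // no conventional prefix found: default without failing
}
```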
Contributing
Contributions are welcome! Please see our Contributing Guide for more details.
Security
If you discover a security vulnerability, please follow our Security Policy.
License
This project is licensed under the MIT License - see the LICENSE file for details.