# TypeScript MCP Agent with Ollama Integration

This project demonstrates integration between Model Context Protocol (MCP) servers and Ollama, allowing AI models to interact with various tools through a unified interface.
## ✨ Features

- Supports multiple MCP servers (both uvx and npx tested)
- Built-in support for file system operations and web research
- Easy configuration through `mcp-config.json`, similar to `claude_desktop_config.json`
- Interactive chat interface with Ollama integration that should work with any MCP tools
- Standalone demo mode for testing web and filesystem tools without an LLM
## 🚀 Getting Started

1. Prerequisites:

   - Node.js (version 18 or higher)
   - Ollama installed and running

2. Globally install the MCP tools you want to use:

   ```shell
   # For filesystem operations
   npm install -g @modelcontextprotocol/server-filesystem

   # For web research
   npm install -g @mzxrai/mcp-webresearch
   ```

3. Clone and install:

   ```shell
   git clone https://github.com/ausboss/mcp-ollama-agent.git
   cd mcp-ollama-agent
   npm install
   ```

4. Configure your tools and a tool-capable Ollama model in `mcp-config.json`:

   ```json
   {
     "mcpServers": {
       "filesystem": {
         "command": "npx",
         "args": ["@modelcontextprotocol/server-filesystem", "./"]
       },
       "webresearch": {
         "command": "npx",
         "args": ["-y", "@mzxrai/mcp-webresearch"]
       }
     },
     "ollama": {
       "host": "http://localhost:11434",
       "model": "qwen2.5:latest"
     }
   }
   ```

5. Run the demo to test the filesystem and web research tools without an LLM:

   ```shell
   npx tsx ./src/demo.ts
   ```

6. Or start the chat interface with Ollama:

   ```shell
   npm start
   ```
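As a sketch of what step 4's configuration implies in code, the snippet below loads and validates a `mcp-config.json` with the shape shown above. The `loadConfig` function and its error messages are illustrative, not the repo's actual implementation:

```typescript
// Minimal sketch of loading and validating mcp-config.json.
// Field names follow the example config above; the repo's own
// loader may be structured differently.
import { readFileSync } from "node:fs";

interface ServerConfig {
  command: string;
  args: string[];
}

interface McpConfig {
  mcpServers: Record<string, ServerConfig>;
  ollama: { host: string; model: string };
}

export function loadConfig(path: string): McpConfig {
  const raw = JSON.parse(readFileSync(path, "utf-8"));
  if (!raw.mcpServers || typeof raw.mcpServers !== "object") {
    throw new Error("mcp-config.json must define an mcpServers section");
  }
  for (const [name, server] of Object.entries<any>(raw.mcpServers)) {
    if (!server.command || !Array.isArray(server.args)) {
      throw new Error(`Server "${name}" needs a command and an args array`);
    }
  }
  if (!raw.ollama?.host || !raw.ollama?.model) {
    throw new Error("mcp-config.json must define ollama.host and ollama.model");
  }
  return raw as McpConfig;
}
```

Failing fast on a malformed config gives a clearer error than letting a server spawn fail later.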
## ⚙️ Configuration

- MCP Servers: Add any MCP-compatible server to the `mcpServers` section
- Ollama: Configure host and model (the model must support function calling)
- Supports both Python (uvx) and Node.js (npx) MCP servers
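For the Ollama model to call MCP tools, their definitions have to be presented in the function-calling format Ollama's chat API accepts. The conversion below is a hedged sketch of that bridging step, not the repo's actual code; it assumes MCP tools expose `name`, `description`, and a JSON Schema `inputSchema`:

```typescript
// Sketch: converting MCP tool definitions into the `tools` array
// format used by Ollama's chat API for function calling.
interface McpTool {
  name: string;
  description?: string;
  inputSchema: Record<string, unknown>;
}

interface OllamaTool {
  type: "function";
  function: {
    name: string;
    description: string;
    parameters: Record<string, unknown>;
  };
}

export function toOllamaTools(tools: McpTool[]): OllamaTool[] {
  return tools.map((t) => ({
    type: "function",
    function: {
      name: t.name,
      description: t.description ?? "",
      // MCP input schemas are already JSON Schema, so they can be
      // passed through as the function's parameters.
      parameters: t.inputSchema,
    },
  }));
}
```

Because both sides speak JSON Schema, the mapping is mostly a relabeling, which is what makes a single agent able to front many different MCP servers.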
## 💡 Example Usage

This example uses the `qwen2.5:latest` model.

```
Chat started. Type "exit" to end the conversation.
You: can you use your list directory tool to see whats in test-directory then use your read file tool to read it to me?
Model is using tools to help answer...
Using tool: list_directory
With arguments: { path: 'test-directory' }
Tool result: [ { type: 'text', text: '[FILE] test.txt' } ]
Assistant:
Model is using tools to help answer...
Using tool: read_file
With arguments: { path: 'test-directory/test.txt' }
Tool result: [ { type: 'text', text: 'rosebud' } ]
Assistant: The content of the file `test.txt` in the `test-directory` is:
rosebud
You: thanks
Assistant: You're welcome! If you have any other requests or need further assistance, feel free to ask.
```
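The transcript above shows the dispatch step of the agent loop: each tool call the model emits is routed to the matching MCP tool and the result is fed back as a tool message. The following is an illustrative sketch of that step; `dispatchToolCalls`, `ToolHandler`, and the message shape are assumptions, not the repo's actual API:

```typescript
// Sketch of routing model tool calls to registered MCP tool handlers
// and collecting the results as `tool` messages for the next turn.
interface ToolCall {
  function: { name: string; arguments: Record<string, unknown> };
}

type ToolHandler = (args: Record<string, unknown>) => Promise<string>;

export async function dispatchToolCalls(
  calls: ToolCall[],
  registry: Map<string, ToolHandler>,
): Promise<{ role: "tool"; content: string }[]> {
  const messages: { role: "tool"; content: string }[] = [];
  for (const call of calls) {
    const handler = registry.get(call.function.name);
    // Report unknown tools back to the model instead of crashing,
    // so it can recover on the next turn.
    const content = handler
      ? await handler(call.function.arguments)
      : `Unknown tool: ${call.function.name}`;
    messages.push({ role: "tool", content });
  }
  return messages;
}
```

In the transcript, `list_directory` and then `read_file` would each pass through a loop like this, with the tool results appended to the conversation before the model produces its final answer.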
## System Prompts

Some local models may need help with tool selection. Customize the system prompt in `ChatManager.ts` to improve tool usage.
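As one illustration of such a customization, a prompt can name the available tools and state when to use them. The constant below is purely an example, not the prompt shipped in `ChatManager.ts`:

```typescript
// Illustrative system prompt nudging smaller local models toward
// correct tool selection; the real prompt lives in ChatManager.ts.
export const SYSTEM_PROMPT = [
  "You are a helpful assistant with access to tools.",
  "When a request involves files, use the filesystem tools",
  "(list_directory, read_file) rather than guessing.",
  "When a request needs current information from the web,",
  "use the web research tools.",
  "Only call a tool when it is needed to answer the question.",
].join(" ");
```

Explicitly listing tool names and trigger conditions tends to help models that otherwise skip tools or pick the wrong one.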
## 🤝 Contributing

Contributions welcome! Feel free to submit issues or pull requests.