
MCP-Transformers
LangChain wrapper for seamless integration of MCP servers with different open-source large language models from the Transformers library.
3 years · Works with Finder · 1 GitHub Watcher · 0 GitHub Forks · 0 GitHub Stars
MCP-OpenLLM
LangChain wrapper for seamless integration of different MCP servers with open-source large language models (LLMs). Models from the LangChain community integrations can be used as well.
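The first building block of such a wrapper is exposing a Transformers model through LangChain's chat-model interface, so that MCP tools can later be attached to it. Below is a minimal sketch of that step, assuming the langchain-huggingface package; the model id, task, and generation parameters are placeholder choices, not defaults taken from this repo.

```python
# Hypothetical sketch: load an open-source model from the Hugging Face
# Transformers library as a LangChain chat model. The model id and
# parameters below are illustrative placeholders.
from langchain_huggingface import ChatHuggingFace, HuggingFacePipeline

# The model name and task are plain parameters, so any text-generation
# model from the Hugging Face Hub can be swapped in.
llm = HuggingFacePipeline.from_model_id(
    model_id="Qwen/Qwen2.5-0.5B-Instruct",   # any Hub model id
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 256},
)
chat_model = ChatHuggingFace(llm=llm)

print(chat_model.invoke("What does an MCP server expose to a client?").content)
```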
Roadmap
- Implement a LangChain wrapper for Hugging Face models
- Expose the Transformers model name and type as parameters
- Test a Cloudflare-hosted remote MCP server (see the connection sketch after this list)
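For the MCP side, a common pattern is to discover tools from one or more MCP servers and hand them to an agent built around the wrapped model. The sketch below assumes a recent langchain-mcp-adapters and langgraph; the server names, local script path, and remote URL (including the Cloudflare-style endpoint) are placeholders, not part of this repo.

```python
# Hypothetical sketch: attach tools from MCP servers (one local, one remote)
# to a LangChain chat model wrapping a Transformers model.
import asyncio

from langchain_huggingface import ChatHuggingFace, HuggingFacePipeline
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

# Wrap an open-source Transformers model as a LangChain chat model
# (placeholder model id; any instruct-tuned Hub model could be used).
chat_model = ChatHuggingFace(
    llm=HuggingFacePipeline.from_model_id(
        model_id="Qwen/Qwen2.5-0.5B-Instruct",
        task="text-generation",
        pipeline_kwargs={"max_new_tokens": 256},
    )
)

async def main() -> None:
    # Declare the MCP servers to connect to: one spawned locally over stdio
    # and one remote endpoint (e.g. hosted on Cloudflare) over streamable HTTP.
    client = MultiServerMCPClient(
        {
            "local_tools": {
                "command": "python",
                "args": ["./my_mcp_server.py"],  # placeholder local server script
                "transport": "stdio",
            },
            "remote_tools": {
                "url": "https://example.workers.dev/mcp",  # placeholder remote URL
                "transport": "streamable_http",
            },
        }
    )

    # Fetch the servers' tools as LangChain tools and build a ReAct-style agent.
    tools = await client.get_tools()
    agent = create_react_agent(chat_model, tools)

    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "Use the available tools to answer."}]}
    )
    print(result["messages"][-1].content)

asyncio.run(main())
```

Locally spawned servers are reached over stdio, while a remote MCP server such as one deployed on Cloudflare Workers is reached over the streamable HTTP (or SSE) transport.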
The repo was inspired by several articles.
Reviews

user_QQra5z8t
As a dedicated user of MCP-Transformers by getStRiCtd, I can confidently say that this application is a game-changer for deploying large language models efficiently. Its seamless integration and user-friendly interface make it incredibly convenient to work with sophisticated models. Highly recommended for developers looking to streamline their AI workflows!