
llm-name
Oh! The way model providers name their models is like sh**.
GitHub Watches: 1 · GitHub Forks: 0 · GitHub Stars: 0
LLMs Name
- The main site of 10k: https://www.shikei.me/ (check it out if you're interested)
- You can visit the app at https://llm-name.vercel.app/
Project Introduction
As a frontend developer who used to spend every day in VS Code, I now use Cursor instead. ChatGPT was first launched in November 2022, and the upgraded GPT-4 model was released in March 2023. Generally, the more recent the release date, the stronger the model's capabilities.
Understanding Timing: here, "Model Timing" refers to the point in time when a large language model is released, its moment of excitement. (In gaming there is a similar saying about "catching the timing".) As the number of models grows, version names are becoming more and more complex. Against this background, we are building a website mainly to record these "model moments" of large language models, primarily in the form of timestamps, showing how the various companies update their models over time. We hope to "catch" the explosive moment of each release. This also helps us understand each model's characteristics better, so we can apply them more effectively in work and everyday life.
However, as model names grow more complex, we sometimes do not even know which model or version is stronger. That is the background behind this website.
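To make the idea of recording "model moments" concrete, here is a minimal TypeScript sketch of what one timeline entry could look like. The type, field names, and sample data are illustrative assumptions, not the project's actual data model.

```ts
// Hypothetical shape of one "model moment" on the timeline.
interface ModelMoment {
  provider: string;   // e.g. "OpenAI", "Anthropic", "Google"
  model: string;      // e.g. "GPT-4", "Claude 3"
  releasedAt: string; // ISO 8601 date of the release
  notes?: string;     // optional context, e.g. headline capabilities
}

// A couple of well-known releases as sample data.
const moments: ModelMoment[] = [
  { provider: 'OpenAI', model: 'ChatGPT', releasedAt: '2022-11-30' },
  { provider: 'OpenAI', model: 'GPT-4', releasedAt: '2023-03-14' },
];

// Newest first, so the latest "explosive moment" sits at the top of the timeline.
const timeline = [...moments].sort(
  (a, b) => Date.parse(b.releasedAt) - Date.parse(a.releasedAt)
);

console.log(timeline[0].model); // "GPT-4"
```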
Features
- 📊 Interactive timeline visualization of major LLM model releases (see the sketch after this list)
- 🔍 Detailed information about each model's capabilities and use cases
- 💡 Helpful tooltips providing context and usage recommendations
- 🌐 Cross-platform compatibility with responsive design
- 🔄 Regular updates as new models are released
- 🎨 Clean and intuitive user interface with beautiful particle animation
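As a rough illustration of the timeline and tooltip features listed above, the sketch below renders release entries newest-first and uses a native title attribute as the tooltip. The component and prop names are hypothetical; the real project likely uses richer custom components.

```tsx
// Hypothetical timeline list component (Next.js / Tailwind CSS).
type TimelineItem = {
  model: string;
  provider: string;
  releasedAt: string; // ISO 8601 date
  blurb?: string;     // short usage note shown as a tooltip
};

export function Timeline({ items }: { items: TimelineItem[] }) {
  // Newest releases first.
  const sorted = [...items].sort(
    (a, b) => Date.parse(b.releasedAt) - Date.parse(a.releasedAt)
  );
  return (
    <ol className="space-y-2">
      {sorted.map((item) => (
        <li
          key={`${item.provider}-${item.model}`}
          title={item.blurb ?? `${item.model} by ${item.provider}`} // simple tooltip
          className="rounded border p-2"
        >
          <span className="font-semibold">{item.model}</span>{' '}
          <span className="text-sm text-gray-500">
            {item.provider} · {item.releasedAt}
          </span>
        </li>
      ))}
    </ol>
  );
}
```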
Tech Stack
- Framework: Next.js
- Styling: Tailwind CSS
- UI Components: Custom components with responsive design
- Animation: Custom particle system (a rough sketch follows below)
- Deployment: Vercel
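The particle animation itself isn't shown in this README, so the following is only a minimal canvas-based sketch of what such a system might look like; every name below is an assumption, not the project's actual code.

```ts
// Minimal sketch of a canvas particle animation (browser-only, hypothetical).
type Particle = { x: number; y: number; vx: number; vy: number };

export function startParticles(canvas: HTMLCanvasElement, count = 60): () => void {
  const ctx = canvas.getContext('2d');
  if (!ctx) return () => {};

  // Spawn particles at random positions with small random velocities.
  const particles: Particle[] = Array.from({ length: count }, () => ({
    x: Math.random() * canvas.width,
    y: Math.random() * canvas.height,
    vx: (Math.random() - 0.5) * 0.6,
    vy: (Math.random() - 0.5) * 0.6,
  }));

  let frame = 0;
  const tick = () => {
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    for (const p of particles) {
      // Move and wrap around the canvas edges.
      p.x = (p.x + p.vx + canvas.width) % canvas.width;
      p.y = (p.y + p.vy + canvas.height) % canvas.height;
      ctx.beginPath();
      ctx.arc(p.x, p.y, 1.5, 0, Math.PI * 2);
      ctx.fillStyle = 'rgba(255, 255, 255, 0.7)';
      ctx.fill();
    }
    frame = requestAnimationFrame(tick);
  };
  frame = requestAnimationFrame(tick);

  // Cleanup function, e.g. for a React useEffect teardown.
  return () => cancelAnimationFrame(frame);
}
```

In a React component, something like this could be started from a useEffect on a full-screen canvas placed behind the timeline, with the returned cleanup function used on unmount.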
Getting Started
Prerequisites
- Node.js 18.x or later
- pnpm
Installation
# Clone the repository
git clone https://github.com/yayxs/llm-name.git
# Navigate to the project directory
cd llm-name
# Install dependencies
pnpm install
# Start the development server
pnpm run dev
Then open http://localhost:3000 in your browser to see the application.
How to Contribute
Contributions are welcome! If you'd like to contribute, please:
- Fork the repository
- Create a new branch (git checkout -b feature/your-feature-name)
- Make your changes
- Commit your changes (git commit -m 'Add some feature')
- Push to the branch (git push origin feature/your-feature-name)
- Open a Pull Request
Please make sure your code follows the existing style and includes appropriate tests.
License
MIT