
mastra
The TypeScript AI agent framework. ⚡ Assistants, RAG, observability. Supports any LLM: GPT-4, Claude, Gemini, Llama.
Mastra
Mastra is an opinionated TypeScript framework that helps you build AI applications and features quickly. It gives you the set of primitives you need: workflows, agents, RAG, integrations and evals. You can run Mastra on your local machine, or deploy to a serverless cloud.
The main Mastra features are:
| Features | Description |
|---|---|
| LLM Models | Mastra uses the Vercel AI SDK for model routing, providing a unified interface to interact with any LLM provider, including OpenAI, Anthropic, and Google Gemini. You can choose the specific model and provider, and decide whether to stream the response. |
| Agents | Agents are systems where the language model chooses a sequence of actions. In Mastra, agents provide LLM models with tools, workflows, and synced data. Agents can call your own functions or the APIs of third-party integrations and access knowledge bases you build (see the agent sketch after this table). |
| Tools | Tools are typed functions that can be executed by agents or workflows, with built-in integration access and parameter validation. Each tool has a schema that defines its inputs, an executor function that implements its logic, and access to configured integrations. |
| Workflows | Workflows are durable graph-based state machines. They support loops, branching, waiting for human input, embedding other workflows, error handling, retries, parsing, and so on. They can be built in code or with a visual editor, and each step in a workflow has built-in OpenTelemetry tracing (see the workflow sketch after this table). |
| RAG | Retrieval-augmented generation (RAG) lets you construct a knowledge base for agents. RAG is an ETL pipeline with specific querying techniques, including chunking, embedding, and vector search. |
| Integrations | In Mastra, integrations are auto-generated, type-safe API clients for third-party services that can be used as tools for agents or as steps in workflows. |
| Evals | Evals are automated tests that evaluate LLM outputs using model-graded, rule-based, and statistical methods. Each eval returns a normalized score between 0 and 1 that can be logged and compared. Evals can be customized with your own prompts and scoring functions. |
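To make the agent and tool primitives concrete, here is a minimal sketch of an agent that uses a single tool. It assumes the `@mastra/core` and `@ai-sdk/openai` packages and uses Zod for the tool's input schema; the tool itself (`get-weather`) is a made-up placeholder, and exact import paths and option names may differ between Mastra versions, so treat this as illustrative rather than canonical.

```typescript
import { Agent } from "@mastra/core/agent";
import { createTool } from "@mastra/core/tools";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// A typed tool: the input schema is validated before execute() runs.
const weatherTool = createTool({
  id: "get-weather",
  description: "Fetch the current weather for a city",
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ context }) => {
    // Placeholder implementation; a real tool would call a weather API here.
    return { city: context.city, forecast: "sunny", temperatureC: 21 };
  },
});

// An agent: instructions + a model (routed through the Vercel AI SDK) + tools.
const weatherAgent = new Agent({
  name: "weather-agent",
  instructions: "You answer questions about the weather using the provided tool.",
  model: openai("gpt-4o-mini"),
  tools: { weatherTool },
});

// Ask the agent a question; it may decide to call the tool before answering.
const result = await weatherAgent.generate("What's the weather in Lisbon?");
console.log(result.text);
```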
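Similarly, a workflow chains typed steps into a graph. The sketch below assumes the `createWorkflow`/`createStep` helpers from `@mastra/core/workflows` and invented step IDs; the workflow API has evolved across Mastra releases, so check the current docs for the precise signatures.

```typescript
import { createWorkflow, createStep } from "@mastra/core/workflows";
import { z } from "zod";

// Step 1: fetch raw data from a URL (stubbed here).
const fetchData = createStep({
  id: "fetch-data",
  inputSchema: z.object({ url: z.string() }),
  outputSchema: z.object({ body: z.string() }),
  execute: async ({ inputData }) => {
    const res = await fetch(inputData.url);
    return { body: await res.text() };
  },
});

// Step 2: summarize the fetched data (trivially stubbed here).
const summarize = createStep({
  id: "summarize",
  inputSchema: z.object({ body: z.string() }),
  outputSchema: z.object({ summary: z.string() }),
  execute: async ({ inputData }) => ({ summary: inputData.body.slice(0, 200) }),
});

// Wire the steps into a linear graph and finalize it.
export const ingestWorkflow = createWorkflow({
  id: "ingest",
  inputSchema: z.object({ url: z.string() }),
  outputSchema: z.object({ summary: z.string() }),
})
  .then(fetchData)
  .then(summarize)
  .commit();
```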
Quick Start
Prerequisites
- Node.js (v20.0+)
Get an LLM provider API key
If you don't have an API key for an LLM provider, you can get one from a provider such as OpenAI, Anthropic, or Google Gemini.
If you don't have an account with these providers, you can sign up and get an API key. Anthropic requires a credit card to get an API key; some OpenAI models and Gemini do not, and have a generous free tier.
Create a new project
The easiest way to get started with Mastra is by using create-mastra. This CLI tool enables you to quickly start building a new Mastra application, with everything set up for you.
npx create-mastra@latest
Run the script
Finally, run mastra dev to open the Mastra playground.
npm run dev
If you're using Anthropic, set the ANTHROPIC_API_KEY environment variable. If you're using Gemini, set the GOOGLE_GENERATIVE_AI_API_KEY environment variable.
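For reference, the Vercel AI SDK provider helpers read these environment variables automatically, so switching providers is mostly a matter of swapping the model passed to your agent. This is a minimal sketch assuming the `@ai-sdk/anthropic` and `@ai-sdk/google` packages, with illustrative model IDs; the agent definition mirrors the earlier example.

```typescript
import { Agent } from "@mastra/core/agent";
import { anthropic } from "@ai-sdk/anthropic"; // reads ANTHROPIC_API_KEY
import { google } from "@ai-sdk/google";       // reads GOOGLE_GENERATIVE_AI_API_KEY

// Pick whichever provider you set a key for; the model IDs here are examples.
const model = process.env.ANTHROPIC_API_KEY
  ? anthropic("claude-3-5-sonnet-latest")
  : google("gemini-1.5-flash");

export const assistant = new Agent({
  name: "assistant",
  instructions: "You are a helpful assistant.",
  model,
});
```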
Contributing
Looking to contribute? All types of help are appreciated, from coding to testing and feature specification.
If you are a developer and would like to contribute code, please open an issue to discuss it before opening a pull request.
Information about the project setup can be found in the development documentation.
Support
We have an open community Discord. Come and say hello and let us know if you have any questions or need any help getting things running.
It's also super helpful if you leave the project a star at the top of its GitHub page.