
Inbox Zero - Your AI Email Assistant

Open source email app to reach inbox zero fast.
Website · Discord · Issues

About

There are two parts to Inbox Zero:

  1. An AI email assistant that helps you spend less time on email.
  2. Open source AI email client.

If you're looking to contribute to the project, the email client is the best place to do so.

Deploy with Vercel

Thanks to Vercel for sponsoring Inbox Zero in support of open-source software.

Features

  • AI Personal Assistant: Manages your email for you based on a plain text prompt file. It can take any action a human assistant can take on your behalf (Draft reply, Label, Archive, Reply, Forward, Mark Spam, and even call a webhook).
  • Reply Zero: Track emails that need your reply and those awaiting responses.
  • Smart Categories: Categorize everyone that's ever emailed you.
  • Bulk Unsubscriber: Unsubscribe in one click from emails you never read.
  • Cold Email Blocker: Automatically block cold emails.
  • Email Analytics: Track your email activity with daily, weekly, and monthly stats.

Learn more in our docs.

Feature Screenshots

AI Assistant Reply Zero
Gmail Client Bulk Unsubscriber

Demo Video

Inbox Zero demo

Built with

Feature Requests

To request a feature, open a GitHub issue. If you don't have a GitHub account, you can request features here, or join our Discord.

Getting Started for Developers

We offer a hosted version of Inbox Zero at https://getinboxzero.com. To self-host follow the steps below.

Contributing to the project

You can view open tasks in our GitHub Issues. Join our Discord to discuss tasks and check what's being worked on.

ARCHITECTURE.md explains the architecture of the project (LLM generated).

Requirements

Setup

Here's a video on how to set up the project. It covers the same steps as this document, but goes into greater detail on setting up the external services.

The external services that are required are:

You also need to set up an LLM, but you can use a local one too:

  • Anthropic
  • OpenAI
  • AWS Bedrock Anthropic
  • Google Gemini
  • Groq Llama 3.3 70B
  • Ollama (local)

We use Postgres for the database. For Redis, you can use Upstash Redis or set up your own Redis instance.

You can run Postgres & Redis locally using docker-compose

docker-compose up -d # -d will run the services in the background
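
For orientation, a minimal compose file providing those two services looks roughly like the sketch below. This is illustrative only — defer to the docker-compose.yml shipped in the repo, which may differ (service names, image versions, extra services).

```yaml
# Illustrative sketch — the repo's actual docker-compose.yml is authoritative.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
      POSTGRES_DB: inboxzero   # hypothetical database name
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```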

Create your own .env file:

cp apps/web/.env.example apps/web/.env
cd apps/web
pnpm install

Set the environment variables in the newly created .env. You can see a list of required variables in: apps/web/env.ts.

The required environment variables:

  • NEXTAUTH_SECRET -- can be any random string (try using openssl rand -hex 32 for a quick secure random string)
  • GOOGLE_CLIENT_ID -- Google OAuth client ID. More info here
  • GOOGLE_CLIENT_SECRET -- Google OAuth client secret. More info here
  • GOOGLE_ENCRYPT_SECRET -- Secret key for encrypting OAuth tokens (try using openssl rand -hex 32 for a secure key)
  • GOOGLE_ENCRYPT_SALT -- Salt for encrypting OAuth tokens (try using openssl rand -hex 16 for a secure salt)
  • UPSTASH_REDIS_URL -- Redis URL from Upstash. (can be empty if you are using Docker Compose)
  • UPSTASH_REDIS_TOKEN -- Redis token from Upstash. (or specify your own random string if you are using Docker Compose)
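
The three secret values above can be generated in one go — this is just the openssl recipe from the list applied in a shell (the variable names match the .env keys):

```shell
# Generate the random secrets listed above with openssl.
NEXTAUTH_SECRET=$(openssl rand -hex 32)       # 64 hex chars
GOOGLE_ENCRYPT_SECRET=$(openssl rand -hex 32) # 64 hex chars
GOOGLE_ENCRYPT_SALT=$(openssl rand -hex 16)   # 32 hex chars
printf '%s %s %s\n' "${#NEXTAUTH_SECRET}" "${#GOOGLE_ENCRYPT_SECRET}" "${#GOOGLE_ENCRYPT_SALT}"
```

Paste the generated values into apps/web/.env.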

When using Vercel with Fluid Compute turned off, you should set MAX_DURATION=300 or lower. See Vercel limits for different plans here.

To run the migrations:

pnpm prisma migrate dev

To run the app locally for development (slower):

pnpm run dev

Or from the project root:

turbo dev

To build and run the app locally in production mode (faster):

pnpm run build
pnpm start

Open http://localhost:3000 to view the app in your browser.

To upgrade yourself, make yourself an admin in the .env: ADMINS=hello@gmail.com. Then upgrade yourself at http://localhost:3000/admin.

Supported LLMs

For the LLM, you can use Anthropic, OpenAI, or Anthropic on AWS Bedrock. You can also use Ollama by setting the following environment variables:

OLLAMA_BASE_URL=http://localhost:11434/api
NEXT_PUBLIC_OLLAMA_MODEL=phi3

Note: If Ollama is hosted locally but the application runs in Docker, use http://host.docker.internal:11434/api as the base URL. You might also need to set OLLAMA_HOST to 0.0.0.0 in the Ollama configuration file.

You can select the model you wish to use on the app's /settings page.

Setting up Google OAuth and Gmail API

You need to enable these scopes in the Google Cloud Console:

https://www.googleapis.com/auth/userinfo.profile
https://www.googleapis.com/auth/userinfo.email
https://www.googleapis.com/auth/gmail.modify
https://www.googleapis.com/auth/gmail.settings.basic
https://www.googleapis.com/auth/contacts

Set up push notifications via Google PubSub to handle emails in real time

Follow instructions here.

  1. Create a topic
  2. Create a subscription
  3. Grant publish rights on your topic
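
If you prefer the CLI to the console, the three steps can be sketched with gcloud. The topic and subscription names below are illustrative, and YOURDOMAIN/TOKEN are placeholders; gmail-api-push@system.gserviceaccount.com is the service account Gmail publishes from, per Google's push-notification docs.

```shell
# 1. Create a topic (name is illustrative)
gcloud pubsub topics create inbox-zero-topic
# 2. Create a push subscription pointing at your webhook
gcloud pubsub subscriptions create inbox-zero-sub \
  --topic=inbox-zero-topic \
  --push-endpoint="https://YOURDOMAIN/api/google/webhook?token=TOKEN"
# 3. Grant Gmail publish rights on the topic
gcloud pubsub topics add-iam-policy-binding inbox-zero-topic \
  --member="serviceAccount:gmail-api-push@system.gserviceaccount.com" \
  --role="roles/pubsub.publisher"
```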

Set the env var GOOGLE_PUBSUB_TOPIC_NAME. When creating the subscription, select Push; the URL should look something like https://www.getinboxzero.com/api/google/webhook?token=TOKEN or https://abc.ngrok-free.app/api/google/webhook?token=TOKEN, where the domain is your own. Set GOOGLE_PUBSUB_VERIFICATION_TOKEN in your .env file to the value of TOKEN.
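
As a sanity check, the push URL is just your domain plus the webhook route and token. The values below are placeholders — substitute your own:

```shell
# Placeholders — substitute your own domain and token.
DOMAIN="abc.ngrok-free.app"
TOKEN="my-secret-token"   # same value goes in GOOGLE_PUBSUB_VERIFICATION_TOKEN
endpoint="https://${DOMAIN}/api/google/webhook?token=${TOKEN}"
echo "$endpoint"
```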

To run in development, ngrok can be helpful:

ngrok http 3000
# or with an ngrok domain to keep your endpoint stable (set `XYZ`):
ngrok http --domain=XYZ.ngrok-free.app 3000

And then update the webhook endpoint in the Google PubSub subscriptions dashboard.

To start watching emails, visit /api/google/watch/all.

Watching for email updates

Set cron jobs to run the following endpoints. The Google watch renewal is required; the Resend summary email is optional. (In the snippet below, "0 1 * * *" runs daily at 01:00 UTC and "0 16 * * 1" runs Mondays at 16:00 UTC.)

  "crons": [
    {
      "path": "/api/google/watch/all",
      "schedule": "0 1 * * *"
    },
    {
      "path": "/api/resend/summary/all",
      "schedule": "0 16 * * 1"
    }
  ]

Here are some easy ways to run cron jobs. Upstash is a free, easy option. I could never get the Vercel vercel.json crons to work; open to PRs if you find a fix for that.
