
openinference
OpenTelemetry Instrumentation for AI Observability
OpenInference is a set of conventions and plugins that is complementary to OpenTelemetry, enabling the tracing of AI applications. OpenInference is natively supported by arize-phoenix, but can be used with any OpenTelemetry-compatible backend as well.
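The pieces fit together as follows: an OpenInference instrumentor emits spans through a standard OpenTelemetry `TracerProvider`, which exports to Phoenix or any other OTLP endpoint. A minimal sketch, assuming `opentelemetry-sdk` and `opentelemetry-exporter-otlp` are installed (the default endpoint shown is Phoenix's local trace collector; adjust it for your backend):

```python
def make_tracer_provider(endpoint: str = "http://localhost:6006/v1/traces"):
    """Build an OpenTelemetry TracerProvider that exports spans over OTLP/HTTP.

    Imports are deferred so this sketch only requires the OpenTelemetry
    packages at call time, not at module import.
    """
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor
    from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

    provider = TracerProvider()
    # Batch spans and ship them to the OTLP/HTTP endpoint.
    provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter(endpoint=endpoint)))
    return provider
```

Any OpenInference instrumentor can then be pointed at this provider, so the same wiring works whether the destination is Phoenix, Arize, or a generic collector.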
Specification
The OpenInference specification is edited in markdown files found in the spec directory. It's designed to provide insight into the invocation of LLMs and the surrounding application context such as retrieval from vector stores and the usage of external tools such as search engines or APIs. The specification is transport and file-format agnostic, and is intended to be used in conjunction with other specifications such as JSON, ProtoBuf, and DataFrames.
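Concretely, the specification models each step of an application as a span carrying flat key-value attributes. The hand-built illustration below uses attribute names in the style of the semantic conventions (the model name and values are hypothetical, and the exact attribute set depends on the span kind; consult the spec directory for the authoritative list):

```python
# An illustrative OpenInference-style LLM span, expressed as flat attributes.
llm_span_attributes = {
    "openinference.span.kind": "LLM",          # other kinds include CHAIN, RETRIEVER, TOOL
    "llm.model_name": "gpt-4o-mini",           # hypothetical model name
    "input.value": "What is OpenInference?",   # the prompt as seen by the application
    "output.value": "A set of tracing conventions for AI applications.",
    "llm.token_count.prompt": 12,
    "llm.token_count.completion": 9,
}

# A retrieval step against a vector store would be its own span with its own kind.
retriever_span_attributes = {
    "openinference.span.kind": "RETRIEVER",
    "input.value": "What is OpenInference?",
}
```

Because the attributes are flat key-value pairs, the same span can be serialized to JSON, ProtoBuf, or a DataFrame row without restructuring.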
Instrumentation
OpenInference provides a set of instrumentations for popular machine learning SDKs and frameworks in a variety of languages.
Python
Libraries
Package | Description | Version |
---|---|---|
openinference-semantic-conventions | Semantic conventions for tracing of LLM Apps. | |
openinference-instrumentation-openai | OpenInference Instrumentation for OpenAI SDK. | |
openinference-instrumentation-openai-agents | OpenInference Instrumentation for OpenAI Agents SDK. | |
openinference-instrumentation-llama-index | OpenInference Instrumentation for LlamaIndex. | |
openinference-instrumentation-dspy | OpenInference Instrumentation for DSPy. | |
openinference-instrumentation-bedrock | OpenInference Instrumentation for AWS Bedrock. | |
openinference-instrumentation-langchain | OpenInference Instrumentation for LangChain. | |
openinference-instrumentation-mcp | OpenInference Instrumentation for MCP. | |
openinference-instrumentation-mistralai | OpenInference Instrumentation for MistralAI. | |
openinference-instrumentation-portkey | OpenInference Instrumentation for Portkey. | |
openinference-instrumentation-guardrails | OpenInference Instrumentation for Guardrails. | |
openinference-instrumentation-vertexai | OpenInference Instrumentation for VertexAI. | |
openinference-instrumentation-crewai | OpenInference Instrumentation for CrewAI. | |
openinference-instrumentation-haystack | OpenInference Instrumentation for Haystack. | |
openinference-instrumentation-litellm | OpenInference Instrumentation for liteLLM. | |
openinference-instrumentation-groq | OpenInference Instrumentation for Groq. | |
openinference-instrumentation-instructor | OpenInference Instrumentation for Instructor. | |
openinference-instrumentation-anthropic | OpenInference Instrumentation for Anthropic. | |
openinference-instrumentation-beeai | OpenInference Instrumentation for BeeAI. | |
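As a usage sketch, each instrumentation package exposes an instrumentor that patches the SDK it targets. With `openinference-instrumentation-openai` installed, applying it looks roughly like this (imports are deferred so the file parses without the package; the tracer-provider wiring is up to you):

```python
def instrument_openai_sdk(tracer_provider=None):
    """Patch the OpenAI SDK so chat and embedding calls emit OpenInference spans.

    Assumes `openinference-instrumentation-openai` is installed. Passing no
    tracer_provider falls back to the globally configured one.
    """
    from openinference.instrumentation.openai import OpenAIInstrumentor

    OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
    # After this call, ordinary `openai` client calls are traced automatically.
```

The other packages in the table follow the same pattern with their own instrumentor classes.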
Examples
Name | Description | Complexity Level |
---|---|---|
OpenAI SDK | OpenAI Python SDK, including chat completions and embeddings | Beginner |
MistralAI SDK | MistralAI Python SDK | Beginner |
VertexAI SDK | VertexAI Python SDK | Beginner |
LlamaIndex | LlamaIndex query engines | Beginner |
DSPy | DSPy primitives and custom RAG modules | Beginner |
Boto3 Bedrock Client | Boto3 Bedrock client | Beginner |
LangChain | LangChain primitives and simple chains | Beginner |
LiteLLM | A lightweight LiteLLM framework | Beginner |
LiteLLM Proxy | LiteLLM Proxy to log OpenAI, Azure, Vertex, Bedrock | Beginner |
Groq | Groq and AsyncGroq chat completions | Beginner |
Anthropic | Anthropic Messages client | Beginner |
BeeAI | Agentic instrumentation in the BeeAI framework | Beginner |
LlamaIndex + Next.js Chatbot | A fully functional chatbot using Next.js and a LlamaIndex FastAPI backend | Intermediate |
LangServe | A LangChain application deployed with LangServe using custom metadata on a per-request basis | Intermediate |
DSPy | A DSPy RAG application using FastAPI, Weaviate, and Cohere | Intermediate |
Haystack | A Haystack QA RAG application | Intermediate |
OpenAI Agents | OpenAI Agents with handoffs | Intermediate |
JavaScript
Libraries
Package | Description | Version |
---|---|---|
@arizeai/openinference-semantic-conventions | Semantic conventions for tracing of LLM Apps. | |
@arizeai/openinference-core | Core utility functions for instrumentation. | |
@arizeai/openinference-instrumentation-beeai | OpenInference Instrumentation for BeeAI. | |
@arizeai/openinference-instrumentation-langchain | OpenInference Instrumentation for LangChain.js. | |
@arizeai/openinference-instrumentation-mcp | OpenInference Instrumentation for MCP. | |
@arizeai/openinference-instrumentation-openai | OpenInference Instrumentation for OpenAI SDK. | |
@arizeai/openinference-vercel | OpenInference Support for Vercel AI SDK. | |
Examples
Name | Description | Complexity Level |
---|---|---|
OpenAI SDK | OpenAI Node.js client | Beginner |
BeeAI framework - ReAct agent | Agentic ReActAgent instrumentation in the BeeAI framework | Beginner
BeeAI framework - ToolCalling agent | Agentic ToolCallingAgent instrumentation in the BeeAI framework | Beginner
BeeAI framework - LLM | Instrumentation scoped to only the LLM module of the BeeAI framework | Beginner
LlamaIndex Express App | A fully functional LlamaIndex chatbot with a Next.js frontend and a LlamaIndex Express backend, instrumented using openinference-instrumentation-openai | Intermediate
LangChain OpenAI | A simple script to call OpenAI via LangChain, instrumented using openinference-instrumentation-langchain | Beginner
LangChain RAG Express App | A fully functional LangChain chatbot that uses RAG to answer user questions. It has a Next.js frontend and a LangChain Express backend, instrumented using openinference-instrumentation-langchain | Intermediate
Next.js + OpenAI | A Next.js 13 project bootstrapped with create-next-app that uses OpenAI to generate text | Beginner
Supported Destinations
OpenInference supports the following destinations as span collectors.
- ✅ Arize-Phoenix
- ✅ Arize
- ✅ Any OTEL-compatible collector
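Because the exporters speak plain OTLP, switching destinations is usually just a matter of the standard OpenTelemetry endpoint variable (the ports below are common defaults and shown here as assumptions; check your collector's configuration):

```shell
# Local Phoenix instance (default HTTP port):
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:6006"

# Or any generic OTEL-compatible collector speaking OTLP/HTTP:
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"
```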
Community
Join our community to connect with thousands of machine learning practitioners and LLM observability enthusiasts!
- 🌍 Join our Slack community.
- 💡 Ask questions and provide feedback in the #phoenix-support channel.
- 🌟 Leave a star on our GitHub.
- 🐞 Report bugs with GitHub Issues.
- 𝕏 Follow us on X.
- 🗺️ Check out our roadmap to see where we're heading next.