OpenTelemetry Instrumentation for AI Observability


OpenInference

OpenInference is a set of conventions and plugins that is complementary to OpenTelemetry and enables tracing of AI applications. OpenInference is natively supported by arize-phoenix, but it can also be used with any OpenTelemetry-compatible backend.

Specification

The OpenInference specification is edited in markdown files found in the spec directory. It is designed to provide insight into the invocation of LLMs and the surrounding application context, such as retrieval from vector stores and the use of external tools such as search engines or APIs. The specification is transport- and file-format-agnostic and is intended to be used in conjunction with other specifications such as JSON, ProtoBuf, and DataFrames.
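To make the attribute conventions concrete, the sketch below shows the kind of key/value pairs an OpenInference span for a single LLM call might carry. The attribute names follow the naming scheme of the semantic-conventions package; the values are hypothetical example data, and the exact attribute set should be checked against the spec directory.

```python
# Illustrative sketch: OpenInference-style span attributes for one LLM call.
# Keys follow the openinference semantic-conventions naming scheme; the
# values below are hypothetical example data, not output from a real trace.
llm_span_attributes = {
    "openinference.span.kind": "LLM",         # span kind: LLM, RETRIEVER, TOOL, ...
    "llm.model_name": "gpt-4o-mini",          # model that was invoked
    "llm.token_count.prompt": 42,             # tokens in the prompt
    "llm.token_count.completion": 17,         # tokens generated
    "input.value": "What is OpenInference?",  # raw input payload
    "output.value": "A set of tracing conventions for AI apps.",  # raw output
}

# A backend can derive the total token count when it is not reported directly.
total_tokens = (
    llm_span_attributes["llm.token_count.prompt"]
    + llm_span_attributes["llm.token_count.completion"]
)
```

Because the attributes are plain key/value pairs on an OpenTelemetry span, any OTLP-compatible collector can store and query them without format-specific support.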

Instrumentation

OpenInference provides a set of instrumentations for popular machine learning SDKs and frameworks in a variety of languages.

Python

Libraries

| Package | Description |
| --- | --- |
| `openinference-semantic-conventions` | Semantic conventions for tracing of LLM apps. |
| `openinference-instrumentation-openai` | OpenInference Instrumentation for the OpenAI SDK. |
| `openinference-instrumentation-openai-agents` | OpenInference Instrumentation for the OpenAI Agents SDK. |
| `openinference-instrumentation-llama-index` | OpenInference Instrumentation for LlamaIndex. |
| `openinference-instrumentation-dspy` | OpenInference Instrumentation for DSPy. |
| `openinference-instrumentation-bedrock` | OpenInference Instrumentation for AWS Bedrock. |
| `openinference-instrumentation-langchain` | OpenInference Instrumentation for LangChain. |
| `openinference-instrumentation-mcp` | OpenInference Instrumentation for MCP. |
| `openinference-instrumentation-mistralai` | OpenInference Instrumentation for MistralAI. |
| `openinference-instrumentation-portkey` | OpenInference Instrumentation for Portkey. |
| `openinference-instrumentation-guardrails` | OpenInference Instrumentation for Guardrails. |
| `openinference-instrumentation-vertexai` | OpenInference Instrumentation for VertexAI. |
| `openinference-instrumentation-crewai` | OpenInference Instrumentation for CrewAI. |
| `openinference-instrumentation-haystack` | OpenInference Instrumentation for Haystack. |
| `openinference-instrumentation-litellm` | OpenInference Instrumentation for LiteLLM. |
| `openinference-instrumentation-groq` | OpenInference Instrumentation for Groq. |
| `openinference-instrumentation-instructor` | OpenInference Instrumentation for Instructor. |
| `openinference-instrumentation-anthropic` | OpenInference Instrumentation for Anthropic. |
| `openinference-instrumentation-beeai` | OpenInference Instrumentation for BeeAI. |
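Conceptually, each instrumentor wraps the underlying SDK's calls in a span and records OpenInference attributes on it. The stdlib-only sketch below illustrates that wrapping pattern; `traced`, `record`, and `fake_chat_completion` are hypothetical stand-ins for illustration, not the real instrumentor API (a real instrumentor exports spans via OpenTelemetry rather than appending to a list).

```python
# Illustrative sketch of what an instrumentor does: wrap an SDK call in a
# span-like record carrying OpenInference-style attributes. All names here
# are hypothetical stand-ins; real packages patch the SDK and emit
# OpenTelemetry spans instead of appending dicts to a list.
import functools
import time

def traced(span_kind, spans):
    """Wrap a function so each call is recorded as an OpenInference-style span."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.time()
            result = fn(*args, **kwargs)
            spans.append({
                "name": fn.__name__,
                "openinference.span.kind": span_kind,
                "input.value": repr(args) if args else repr(kwargs),
                "output.value": repr(result),
                "duration_s": time.time() - start,
            })
            return result
        return wrapper
    return decorator

spans = []  # a real instrumentor would hand these to an OTel exporter

@traced("LLM", spans)
def fake_chat_completion(prompt):
    # hypothetical stand-in for an SDK call such as a chat completion
    return f"echo: {prompt}"

fake_chat_completion("hello")
```

Because the wrapping is transparent to the caller, application code keeps using the SDK as before while every call produces a span for the collector.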

Examples

| Name | Description | Complexity Level |
| --- | --- | --- |
| OpenAI SDK | OpenAI Python SDK, including chat completions and embeddings | Beginner |
| MistralAI SDK | MistralAI Python SDK | Beginner |
| VertexAI SDK | VertexAI Python SDK | Beginner |
| LlamaIndex | LlamaIndex query engines | Beginner |
| DSPy | DSPy primitives and custom RAG modules | Beginner |
| Boto3 Bedrock Client | Boto3 Bedrock client | Beginner |
| LangChain | LangChain primitives and simple chains | Beginner |
| LiteLLM | A lightweight LiteLLM framework | Beginner |
| LiteLLM Proxy | LiteLLM Proxy to log OpenAI, Azure, Vertex, Bedrock | Beginner |
| Groq | Groq and AsyncGroq chat completions | Beginner |
| Anthropic | Anthropic Messages client | Beginner |
| BeeAI | Agentic instrumentation in the BeeAI framework | Beginner |
| LlamaIndex + Next.js Chatbot | A fully functional chatbot using Next.js and a LlamaIndex FastAPI backend | Intermediate |
| LangServe | A LangChain application deployed with LangServe using custom metadata on a per-request basis | Intermediate |
| DSPy | A DSPy RAG application using FastAPI, Weaviate, and Cohere | Intermediate |
| Haystack | A Haystack QA RAG application | Intermediate |
| OpenAI Agents | OpenAI Agents with handoffs | Intermediate |

JavaScript

Libraries

| Package | Description |
| --- | --- |
| `@arizeai/openinference-semantic-conventions` | Semantic conventions for tracing of LLM apps. |
| `@arizeai/openinference-core` | Core utility functions for instrumentation. |
| `@arizeai/openinference-instrumentation-beeai` | OpenInference Instrumentation for BeeAI. |
| `@arizeai/openinference-instrumentation-langchain` | OpenInference Instrumentation for LangChain.js. |
| `@arizeai/openinference-instrumentation-mcp` | OpenInference Instrumentation for MCP. |
| `@arizeai/openinference-instrumentation-openai` | OpenInference Instrumentation for the OpenAI SDK. |
| `@arizeai/openinference-vercel` | OpenInference support for the Vercel AI SDK. |

Examples

| Name | Description | Complexity Level |
| --- | --- | --- |
| OpenAI SDK | OpenAI Node.js client | Beginner |
| BeeAI framework - ReAct agent | Agentic ReActAgent instrumentation in the BeeAI framework | Beginner |
| BeeAI framework - ToolCalling agent | Agentic ToolCallingAgent instrumentation in the BeeAI framework | Beginner |
| BeeAI framework - LLM | How to run instrumentation only for a specific LLM module in the BeeAI framework | Beginner |
| LlamaIndex Express App | A fully functional LlamaIndex chatbot with a Next.js frontend and a LlamaIndex Express backend, instrumented using `openinference-instrumentation-openai` | Intermediate |
| LangChain OpenAI | A simple script to call OpenAI via LangChain, instrumented using `openinference-instrumentation-langchain` | Beginner |
| LangChain RAG Express App | A fully functional LangChain chatbot that uses RAG to answer user questions, with a Next.js frontend and a LangChain Express backend, instrumented using `openinference-instrumentation-langchain` | Intermediate |
| Next.js + OpenAI | A Next.js 13 project bootstrapped with create-next-app that uses OpenAI to generate text | Beginner |

Supported Destinations

OpenInference supports the following destinations as span collectors.

Community

Join our community to connect with thousands of machine learning practitioners and LLM observability enthusiasts!

  • 🌍 Join our Slack community.
  • 💡 Ask questions and provide feedback in the #phoenix-support channel.
  • 🌟 Leave a star on our GitHub.
  • 🐞 Report bugs with GitHub Issues.
  • 𝕏 Follow us on X.
  • 🗺️ Check out our roadmap to see where we're heading next.

