
Phoenix

Phoenix is an open-source platform for AI observability. It lets you trace, evaluate, and optimize LLM-based applications, and supports a range of frameworks and providers.


Phoenix is an open-source AI observability platform designed for experimentation, evaluation, and troubleshooting. It provides:

  • Tracing - Trace your LLM application's runtime using OpenTelemetry-based instrumentation.
  • Evaluation - Leverage LLMs to benchmark your application's performance using response and retrieval evals.
  • Datasets - Create versioned datasets of examples for experimentation, evaluation, and fine-tuning.
  • Experiments - Track and evaluate changes to prompts, LLMs, and retrieval.
  • Playground - Optimize prompts, compare models, adjust parameters, and replay traced LLM calls.
  • Prompt Management - Manage and test prompt changes systematically using version control, tagging, and experimentation.

Phoenix is vendor and language agnostic with out-of-the-box support for popular frameworks (🦙LlamaIndex, 🦜⛓LangChain, Haystack, 🧩DSPy, 🤗smolagents) and LLM providers (OpenAI, Bedrock, MistralAI, VertexAI, LiteLLM, Google GenAI and more). For details on auto-instrumentation, check out the OpenInference project.

Phoenix runs practically anywhere, including your local machine, a Jupyter notebook, a containerized deployment, or in the cloud.

Installation

Install Phoenix via pip or conda

pip install arize-phoenix

Phoenix container images are available via Docker Hub and can be deployed using Docker or Kubernetes.
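As an illustrative single-container deployment (port numbers reflect Phoenix defaults: 6006 for the UI and HTTP collector, 4317 for the OTLP gRPC collector):

```shell
# Pull the Phoenix image from Docker Hub and run it,
# exposing the UI/HTTP port and the OTLP gRPC collector port
docker pull arizephoenix/phoenix:latest
docker run -p 6006:6006 -p 4317:4317 arizephoenix/phoenix:latest
```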

Packages

The arize-phoenix package includes the entire Phoenix platform. However, if you have deployed the Phoenix platform, there are lightweight Python sub-packages and TypeScript packages that can be used in conjunction with the platform.

Subpackages

| Package | Language | Description |
| ----------------------- | ---------- | ------------------------------------------------------------------------------------------------- |
| arize-phoenix-otel | Python | Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults |
| arize-phoenix-client | Python | Lightweight client for interacting with the Phoenix server via its OpenAPI REST interface |
| arize-phoenix-evals | Python | Tooling to evaluate LLM applications including RAG relevance, answer relevance, and more |
| @arizeai/phoenix-client | JavaScript | Client for the Arize Phoenix API |
| @arizeai/phoenix-mcp | JavaScript | MCP server implementation for Arize Phoenix providing unified interface to Phoenix's capabilities |

Tracing Integrations

Phoenix is built on top of OpenTelemetry and is vendor, language, and framework agnostic. For details about tracing integrations and example applications, see the OpenInference project.

Python Integrations

| Integration | Package |
| ----------------- | ------------------------------------------------- |
| OpenAI | openinference-instrumentation-openai |
| OpenAI Agents | openinference-instrumentation-openai-agents |
| LlamaIndex | openinference-instrumentation-llama-index |
| DSPy | openinference-instrumentation-dspy |
| AWS Bedrock | openinference-instrumentation-bedrock |
| LangChain | openinference-instrumentation-langchain |
| MistralAI | openinference-instrumentation-mistralai |
| Google GenAI | openinference-instrumentation-google-genai |
| Google ADK | openinference-instrumentation-google-adk |
| Guardrails | openinference-instrumentation-guardrails |
| VertexAI | openinference-instrumentation-vertexai |
| CrewAI | openinference-instrumentation-crewai |
| Haystack | openinference-instrumentation-haystack |
| LiteLLM | openinference-instrumentation-litellm |
| Groq | openinference-instrumentation-groq |
| Instructor | openinference-instrumentation-instructor |
| Anthropic | openinference-instrumentation-anthropic |
| Smolagents | openinference-instrumentation-smolagents |
| Agno | openinference-instrumentation-agno |
| MCP | openinference-instrumentation-mcp |
| Pydantic AI | openinference-instrumentation-pydantic-ai |
| Autogen AgentChat | openinference-instrumentation-autogen-agentchat |
| Portkey | openinference-instrumentation-portkey |

JavaScript Integrations

| Integration | Package |
| ------------- | -------------------------------------------------- |
| OpenAI | @arizeai/openinference-instrumentation-openai |
| LangChain.js | @arizeai/openinference-instrumentation-langchain |
| Vercel AI SDK | @arizeai/openinference-vercel |
| BeeAI | @arizeai/openinference-instrumentation-beeai |
| Mastra | @arizeai/openinference-mastra |

Platforms

Phoenix has native integrations with LangFlow, LiteLLM Proxy, and BeeAI.

Community

Join our community to connect with thousands of AI builders.

Breaking Changes

See the migration guide for a list of breaking changes.

Copyright, Patent, and License

Copyright 2025 Arize AI, Inc. All Rights Reserved.

Portions of this code are patent protected by one or more U.S. Patents. See the IP_NOTICE.

This software is licensed under the terms of the Elastic License 2.0 (ELv2). See LICENSE.