Agent Stack includes built-in observability through OpenTelemetry (OTLP), with Arize Phoenix available out of the box. Monitor your agents through logging, telemetry, and integration with external monitoring systems.

View Agent Logs

Stream real-time logs from any running agent:
agentstack logs
What you’ll see:
  • Agent startup and initialization
  • Request processing steps
  • Error messages and stack traces
  • Container lifecycle events
Logs are only available for managed (containerized) agents that are currently running.
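If you want to consume these logs programmatically (for alerting or post-processing), a small wrapper can stream a command's output line by line. This is a hedged sketch: it assumes `agentstack logs` writes to stdout, and the stand-in `echo` command and error-highlighting logic are purely illustrative.

```python
import subprocess

def stream_logs(cmd):
    """Run a command and yield its output lines as they arrive.
    In practice you would pass ["agentstack", "logs"] (assumption:
    the CLI streams log lines to stdout)."""
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,  # merge stderr so stack traces are captured
        text=True,
    )
    assert proc.stdout is not None
    for line in proc.stdout:
        yield line.rstrip("\n")
    proc.wait()

if __name__ == "__main__":
    # Stand-in command; flag error lines while streaming.
    for line in stream_logs(["echo", "demo log line"]):
        marker = ">>" if "error" in line.lower() else "  "
        print(marker, line)
```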

Telemetry Collection

Agent Stack includes OpenTelemetry instrumentation to collect traces and metrics. Telemetry data helps with performance monitoring, error tracking, usage analytics, and debugging agent interactions. By default, Agent Stack sends telemetry to a local Phoenix instance (if one is running) for trace visualization.
The telemetry includes:
  • Platform version and runtime details
  • Agent execution traces

Quickstart: Enable Phoenix Observability

Arize Phoenix provides visualization for OpenTelemetry traces from your agents.
Important License Notice: Phoenix is disabled by default in Agent Stack. When you enable Phoenix, be aware that Arize Phoenix is licensed under the Elastic License v2 (ELv2), which has specific terms regarding commercial use and distribution. By enabling Phoenix, you acknowledge that you are responsible for ensuring compliance with the ELv2 license terms for your specific use case. Please review the Phoenix license before enabling this feature in production environments.
1. Install and Enable Arize Phoenix

Install and start Phoenix using the agentstack platform start command:
agentstack platform start --set phoenix.enabled=true
You can run this command even if the platform is already running; no data will be lost.
2. Check if Phoenix is running

Phoenix can take a while to spin up, even after the platform start command reports success. Open http://localhost:6006 to check whether it's running; if it isn't, wait a few minutes or check your internet connection.
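Since startup can lag behind the CLI, a readiness poll saves manual refreshing. Here is a minimal sketch using only the Python standard library; the URL matches the quickstart, while the helper names and retry parameters are assumptions:

```python
import time
import urllib.request
import urllib.error

def phoenix_ready(url="http://localhost:6006", timeout=2.0):
    """Return True if the Phoenix UI answers an HTTP request."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 500
    except (urllib.error.URLError, OSError):
        return False

def wait_for_phoenix(retries=30, delay=10.0):
    """Poll until Phoenix responds; retry counts are illustrative."""
    for _ in range(retries):
        if phoenix_ready():
            return True
        time.sleep(delay)
    return False
```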
3. Run Agent with Phoenix

Execute the following command to run an example chat agent:
agentstack run chat "Hello"
4. View Traces in Phoenix

Open http://localhost:6006 in your browser and navigate to the default project to explore the collected traces.
For a better experience and richer trace detail, consider instrumenting your agents with the OpenInference standard.
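To understand what Phoenix ingests, it can help to see the shape of an OTLP/JSON trace payload. The sketch below builds a minimal one by hand and posts it over OTLP/HTTP. The endpoint path `/v1/traces` on port 6006 is an assumption about the local Phoenix default, and the service and span names are illustrative; in practice you would use the OpenTelemetry SDK or OpenInference instrumentors rather than hand-rolled payloads.

```python
import json
import time
import urllib.request

def build_otlp_payload(span_name, service_name="my-agent"):
    """Build a minimal trace payload following the OTLP JSON encoding.
    IDs below are fixed placeholders; real spans need random IDs."""
    now = time.time_ns()
    return {
        "resourceSpans": [{
            "resource": {"attributes": [{
                "key": "service.name",
                "value": {"stringValue": service_name},
            }]},
            "scopeSpans": [{
                "scope": {"name": "manual-example"},
                "spans": [{
                    "traceId": "0123456789abcdef0123456789abcdef",
                    "spanId": "0123456789abcdef",
                    "name": span_name,
                    "kind": 1,  # SPAN_KIND_INTERNAL
                    "startTimeUnixNano": str(now),
                    "endTimeUnixNano": str(now + 1_000_000),
                }],
            }],
        }]
    }

def send_trace(payload, endpoint="http://localhost:6006/v1/traces"):
    """POST the payload to the (assumed) local Phoenix OTLP/HTTP endpoint."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```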

Advanced Configuration

Configure Langfuse Integration

Langfuse is an LLM observability platform that can be integrated with Agent Stack through OpenTelemetry.
1. Get Langfuse credentials

  1. Sign up at cloud.langfuse.com
  2. Create a project and generate API keys
  3. Encode your keys: echo -n "public_key:secret_key" | base64
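If you generate configuration programmatically, the encoding step above has a simple Python equivalent. A minimal sketch of the `echo -n ... | base64` pipeline (the function name is illustrative):

```python
import base64

def langfuse_basic_auth(public_key, secret_key):
    """Build the Basic auth header value for the Langfuse OTLP exporter,
    equivalent to: echo -n "public_key:secret_key" | base64"""
    token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
    return f"Basic {token}"
```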
2. Create a configuration file (config.yaml):

collector:
  exporters:
    otlphttp/langfuse:
      endpoint: "https://cloud.langfuse.com/api/public/otel" # EU data region
      headers:
        Authorization: "Basic <auth-string>"
  pipelines:
    traces:
      receivers: [ otlp ]
      processors: [ memory_limiter, filter/phoenix, batch ]
      exporters: [ otlphttp/langfuse ]
3. Start the platform with the configuration

agentstack platform start -f config.yaml
4. Access the Langfuse UI

Check your Langfuse project dashboard for incoming traces and metrics.

Additional Resources