View Agent Logs
Stream real-time logs from any running agent. Logs include:

- Agent startup and initialization
- Request processing steps
- Error messages and stack traces
- Container lifecycle events
Logs are only available for managed (containerized) agents that are currently running.
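As a sketch, streaming logs might look like the following; the `logs` subcommand and the agent name are assumptions, not documented syntax, so verify the exact invocation with your CLI's help output:

```shell
# Hypothetical subcommand and agent name -- check `agentstack --help`.
agentstack logs my-chat-agent
```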
Telemetry Collection
Agent Stack includes OpenTelemetry instrumentation to collect traces and metrics. Telemetry data helps with performance monitoring, error tracking, usage analytics, and debugging agent interactions. By default, Agent Stack sends telemetry to a local Phoenix instance (if running) for trace visualization. Collected data includes:

- Platform version and runtime details
- Agent execution traces
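To point telemetry at a different OTLP-compatible backend, the standard OpenTelemetry SDK environment variables are one likely hook; whether Agent Stack honors them is an assumption to verify for your deployment:

```shell
# Standard OTel SDK variables -- confirm your Agent Stack version reads them.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <token>"
```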
Quickstart: Enable Phoenix Observability
Arize Phoenix provides visualization for OpenTelemetry traces from your agents.

Important License Notice: Phoenix is disabled by default in Agent Stack. When you enable Phoenix, be aware that Arize Phoenix is licensed under the Elastic License v2 (ELv2), which has specific terms regarding commercial use and distribution. By enabling Phoenix, you acknowledge that you are responsible for ensuring compliance with the ELv2 license terms for your specific use case. Review the Phoenix license before enabling this feature in production environments.
1. Install and Enable Arize Phoenix

Install and start Phoenix using the `agentstack platform start` command. You can run this even if your platform is already running without losing data.
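A minimal sketch of the command named above (any additional option for enabling Phoenix varies by version, so none is shown here):

```shell
agentstack platform start
```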
2. Check if Phoenix Is Running
Spinning up Phoenix can take a while, even after the `platform start` command reports success. Open http://localhost:6006 and check that it's running; if not, wait a few minutes or check your internet connection.
3. Run Agent with Phoenix
Execute the following command to run an example chat agent:
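One way this could look, assuming a hypothetical `run` subcommand and an example agent named `chat`; verify the exact invocation against your CLI's help output:

```shell
# Hypothetical invocation -- subcommand and agent name are assumptions.
agentstack run chat
```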
4. View Traces in Phoenix
Open http://localhost:6006 in your browser and navigate to the default project to explore the collected traces.
For an enhanced experience and richer trace detail, consider instrumenting your agents with the OpenInference standard.
Advanced Configuration
Configure Langfuse Integration
Langfuse is an LLM observability platform that can be integrated with Agent Stack through OpenTelemetry.

1. Get Langfuse Credentials
- Sign up at cloud.langfuse.com
- Create a project and generate API keys
- Encode your keys:

```shell
echo -n "public_key:secret_key" | base64
```
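Equivalently, the Basic auth value can be built in Python with only the standard library; the key values below are placeholders, not real credentials:

```python
import base64

# Placeholder Langfuse keys -- substitute your project's real values.
public_key = "pk-lf-example"
secret_key = "sk-lf-example"

# Basic auth token: base64 of "public_key:secret_key".
token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
auth_header = f"Basic {token}"
print(auth_header)
```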
2. Create a Configuration File (config.yaml)
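The exact configuration schema depends on your Agent Stack version; the sketch below assumes a `telemetry.exporters.otlp` section (key names are illustrative, not authoritative) and uses Langfuse's OTLP endpoint with the base64 value from the previous step:

```yaml
# Hypothetical schema -- verify key names against your Agent Stack reference.
telemetry:
  exporters:
    otlp:
      endpoint: "https://cloud.langfuse.com/api/public/otel"
      headers:
        # Output of: echo -n "public_key:secret_key" | base64
        Authorization: "Basic AUTH_STRING"
```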
3. Start the Platform with the Configuration
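Assuming the platform command accepts a config-file option (the `--config` flag here is an assumption, not a documented flag):

```shell
# --config is an assumed flag name -- check the command's help output.
agentstack platform start --config config.yaml
```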
4. Access Langfuse UI
Check your Langfuse project dashboard for incoming traces and metrics.
Additional Resources
- OpenTelemetry Docs: https://opentelemetry.io/docs/
- Langfuse Docs: https://langfuse.com/docs
- Phoenix Docs: https://docs.arize.com/phoenix
- Prometheus Docs: https://prometheus.io/docs/
- Grafana Docs: https://grafana.com/docs/