Overview
The OpenHands SDK provides built-in OpenTelemetry (OTEL) tracing support, allowing you to monitor and debug your agent’s execution in real time. You can send traces to any OTLP-compatible observability platform, including:
- Laminar - AI-focused observability with browser session replay support
- Honeycomb - High-performance distributed tracing
- Any OTLP-compatible backend - Including Jaeger, Datadog, New Relic, and more
Traces cover:
- Agent execution steps
- Tool calls and executions
- LLM API calls (via LiteLLM integration)
- Browser automation sessions (when using browser-use)
- Conversation lifecycle events
Quick Start
Tracing is automatically enabled when you set the appropriate environment variables. The SDK detects the configuration on startup and initializes tracing without requiring code changes.
Using Laminar
Laminar provides specialized AI observability features, including browser session replays when using browser-use tools:
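A minimal shell sketch (the key value is a placeholder; see Laminar Setup below for details):

```bash
# Point the SDK at Laminar by setting your project API key
export LMNR_PROJECT_API_KEY="your-laminar-api-key"
```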
Using Honeycomb or Other OTLP Backends
For Honeycomb, Jaeger, or any other OTLP-compatible backend:
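For example, a Honeycomb-style configuration (values are placeholders; the endpoint, header, and protocol formats follow the reference table below):

```bash
# Full OTLP traces endpoint, authentication header, and protocol
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://api.honeycomb.io:443/v1/traces"
export OTEL_EXPORTER_OTLP_TRACES_HEADERS="x-honeycomb-team=YOUR_API_KEY"
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL="http/protobuf"
```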
Alternative Configuration Methods
You can also use these alternative environment variable formats:
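A sketch of the alternative forms, drawn from the reference table below (pick whichever your backend expects; values are placeholders):

```bash
# Base endpoint instead of the full traces URL (the traces path is appended)
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4317"

# Short-form endpoint and protocol
export OTEL_ENDPOINT="http://localhost:4317"
export OTEL_EXPORTER="otlp_grpc"

# General (non-traces-specific) authentication headers
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer%20TOKEN"
```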
How It Works
The OpenHands SDK uses the Laminar SDK as its OpenTelemetry instrumentation layer. When you set the environment variables, the SDK:
- Detects Configuration: Checks for OTEL environment variables on startup
- Initializes Tracing: Configures OpenTelemetry with the appropriate exporter
- Instruments Code: Automatically wraps key functions with tracing decorators
- Captures Context: Associates traces with conversation IDs for session grouping
- Exports Spans: Sends trace data to your configured backend
What Gets Traced
The SDK automatically instruments these components:
- agent.step - Each iteration of the agent’s execution loop
- Tool Executions - Individual tool calls with input/output capture
- LLM Calls - API requests to language models via LiteLLM
- Conversation Lifecycle - Message sending, conversation runs, and title generation
- Browser Sessions - When using browser-use, captures session replays (Laminar only)
Trace Hierarchy
Traces are organized hierarchically:
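An illustrative hierarchy, inferred from the instrumented components listed above (exact span names may differ):

```
conversation run
└── agent.step
    ├── LLM call (via LiteLLM)
    └── tool execution
        └── browser session (browser-use, Laminar only)
```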
Configuration Reference
Environment Variables
The SDK checks for these environment variables (in order of precedence):

| Variable | Description | Example |
|---|---|---|
| LMNR_PROJECT_API_KEY | Laminar project API key | your-laminar-api-key |
| OTEL_EXPORTER_OTLP_TRACES_ENDPOINT | Full OTLP traces endpoint URL | https://api.honeycomb.io:443/v1/traces |
| OTEL_EXPORTER_OTLP_ENDPOINT | Base OTLP endpoint (traces path appended) | http://localhost:4317 |
| OTEL_ENDPOINT | Short-form endpoint | http://localhost:4317 |
| OTEL_EXPORTER_OTLP_TRACES_HEADERS | Authentication headers for traces | x-honeycomb-team=YOUR_API_KEY |
| OTEL_EXPORTER_OTLP_HEADERS | General authentication headers | Authorization=Bearer%20TOKEN |
| OTEL_EXPORTER_OTLP_TRACES_PROTOCOL | Protocol for traces endpoint | http/protobuf, grpc |
| OTEL_EXPORTER | Short-form protocol | otlp_http, otlp_grpc |
Header Format
Headers should be comma-separated key=value pairs, with URL encoding for special characters:
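For example (the second header name is a hypothetical placeholder, shown only to illustrate the comma-separated form):

```bash
# Single header
export OTEL_EXPORTER_OTLP_TRACES_HEADERS="x-honeycomb-team=YOUR_API_KEY"

# Multiple headers, comma-separated; encode special characters such as spaces (%20)
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer%20TOKEN,x-custom-header=value"
```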
Protocol Options
The SDK supports both HTTP and gRPC protocols:
- http/protobuf or otlp_http - HTTP with protobuf encoding (recommended for most backends)
- grpc or otlp_grpc - gRPC with protobuf encoding (use only if your backend supports gRPC)
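For instance, to select HTTP/protobuf explicitly (either form works, per the table above):

```bash
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL="http/protobuf"
# or the short form
export OTEL_EXPORTER="otlp_http"
```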
Platform-Specific Configuration
Laminar Setup
- Sign up at laminar.sh
- Create a project and copy your API key
- Set the environment variable:
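For example (placeholder value):

```bash
export LMNR_PROJECT_API_KEY="your-laminar-api-key"
```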
Honeycomb Setup
- Sign up at honeycomb.io
- Get your API key from the account settings
- Configure the environment:
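A sketch with placeholder credentials (endpoint and header formats follow the reference table above):

```bash
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://api.honeycomb.io:443/v1/traces"
export OTEL_EXPORTER_OTLP_TRACES_HEADERS="x-honeycomb-team=YOUR_API_KEY"
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL="http/protobuf"
```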
Jaeger Setup
For local development with Jaeger:
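A sketch assuming a local Jaeger instance with OTLP ingestion enabled (ports depend on your Jaeger deployment; 4317 is the conventional OTLP/gRPC port):

```bash
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4317"
export OTEL_EXPORTER="otlp_grpc"
```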
Generic OTLP Collector
For other backends, use their OTLP endpoint:
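A generic sketch with placeholder values; substitute your backend’s endpoint, headers, and supported protocol:

```bash
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://your-collector.example.com/v1/traces"
export OTEL_EXPORTER_OTLP_TRACES_HEADERS="Authorization=Bearer%20YOUR_TOKEN"
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL="http/protobuf"
```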
Advanced Usage
Disabling Observability
To disable tracing, simply unset all OTEL environment variables:
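For example, unset whichever of the variables from the reference table you have set:

```bash
unset LMNR_PROJECT_API_KEY
unset OTEL_EXPORTER_OTLP_TRACES_ENDPOINT OTEL_EXPORTER_OTLP_ENDPOINT OTEL_ENDPOINT
unset OTEL_EXPORTER_OTLP_TRACES_HEADERS OTEL_EXPORTER_OTLP_HEADERS
unset OTEL_EXPORTER_OTLP_TRACES_PROTOCOL OTEL_EXPORTER
```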
Custom Span Attributes
The SDK automatically adds these attributes to spans:
- conversation_id - UUID of the conversation
- tool_name - Name of the tool being executed
- action.kind - Type of action being performed
- session_id - Groups all traces from one conversation
Debugging Tracing Issues
If traces aren’t appearing in your observability platform:
- Verify Environment Variables: Confirm the OTEL/LMNR variables are set in the process environment (check 1 below)
- Check SDK Logs: The SDK logs observability initialization at debug level
- Test Connectivity: Ensure your application can reach the OTLP endpoint (check 2 below)
- Validate Headers: Check that authentication headers are properly URL-encoded
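Two quick shell checks (the endpoint URL is a placeholder; an HTTP error response from the endpoint still confirms it is reachable):

```bash
# 1. Verify environment variables are visible to the process
env | grep -E "OTEL|LMNR"

# 2. Test connectivity to the OTLP endpoint
curl -v https://api.honeycomb.io:443/v1/traces
```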
Example: Full Setup
This example is available on GitHub: examples/01_standalone_sdk/27_observability_laminar.py
Running the Example
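A sketch of how you might run it from the repository root (the example may require additional variables, such as an API key for your LLM provider; check the example source for details):

```bash
export LMNR_PROJECT_API_KEY="your-laminar-api-key"
python examples/01_standalone_sdk/27_observability_laminar.py
```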
Troubleshooting
Traces Not Appearing
Problem: No traces showing up in the observability platform
Solutions:
- Verify environment variables are set correctly
- Check network connectivity to OTLP endpoint
- Ensure authentication headers are valid
- Look for SDK initialization logs at debug level
High Trace Volume
Problem: Too many spans being generated
Solutions:
- Configure sampling at the collector level
- When using Laminar without browser tools, browser instrumentation is disabled automatically
- Use backend-specific filtering rules
Performance Impact
Problem: Concerned about tracing overhead
Solutions:
- Tracing has minimal overhead when properly configured
- Disable tracing in development by unsetting environment variables
- Use asynchronous exporters (default in most OTLP configurations)
Next Steps
- Metrics Tracking - Monitor token usage and costs alongside traces
- LLM Registry - Track multiple LLMs used in your application
- Security - Add security validation to your traced agent executions

