Why typed exceptions?
LLM providers format errors differently (status codes, messages, exception classes). The SDK maps those into stable types so client apps don’t depend on provider‑specific details. Typical benefits:
- One code path to handle auth, rate limits, timeouts, service issues, and bad requests
- Clear behavior when conversation history exceeds the context window
- Backward compatibility when you switch providers or SDK versions
Quick start: Using agents and conversations
Agent-driven conversations are the common entry point. Exceptions from the underlying LLM calls bubble up from `conversation.run()` and `conversation.send_message(...)` when a condenser is not configured.
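As a sketch, assuming an `Agent`-backed `Conversation` has already been constructed (setup omitted) and that `send_message` accepts a plain string, the typed exceptions can be handled around `conversation.run()` like this:

```python
from openhands.sdk.llm.exceptions import (
    LLMAuthenticationError,
    LLMContextWindowExceedError,
    LLMRateLimitError,
    LLMServiceUnavailableError,
    LLMTimeoutError,
)

# `conversation` is assumed to be an already-configured Conversation
# bound to an Agent; its construction is omitted here.
conversation.send_message("Review the failing test and propose a fix.")

try:
    conversation.run()
except LLMContextWindowExceedError:
    # Raised only when no condenser is configured (see the next section).
    print("History exceeds the context window; trim it or add a condenser.")
except LLMAuthenticationError:
    print("Invalid or missing provider credentials.")
except LLMRateLimitError:
    print("Provider rate limit hit; retry later with backoff.")
except (LLMTimeoutError, LLMServiceUnavailableError):
    print("Transient provider issue; safe to retry.")
```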
Avoiding context‑window errors with a condenser
If a condenser is configured, the SDK emits a condensation request event instead of raising `LLMContextWindowExceedError`. The agent will summarize older history and continue.
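A minimal configuration sketch follows. The condenser class name, import path, constructor arguments, and model string below are assumptions about the SDK surface rather than verbatim API; check the condenser documentation for the exact names:

```python
# Assumed imports/names for illustration only; verify against your SDK version.
from openhands.sdk import LLM, Agent
from openhands.sdk.context.condenser import LLMSummarizingCondenser  # assumed path

llm = LLM(model="anthropic/claude-sonnet-4")  # assumed constructor arguments

# Summarize older events once history grows beyond ~10 items, keeping the
# first 2 events verbatim (parameter names are assumptions).
condenser = LLMSummarizingCondenser(llm=llm, max_size=10, keep_first=2)

agent = Agent(llm=llm, tools=[], condenser=condenser)
# With the condenser in place, an oversized history triggers a condensation
# request event and summarization instead of LLMContextWindowExceedError.
```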
Handling errors with direct LLM calls
The same exceptions are raised from both `LLM.completion()` and `LLM.responses()` paths, so you can share handlers.
Example: Using completion()
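A hedged sketch of wrapping `LLM.completion()` with typed handlers; the retry policy and helper name are illustrative, and the shape of `messages` is left to the caller:

```python
import time

from openhands.sdk.llm.exceptions import (
    LLMBadRequestError,
    LLMRateLimitError,
    LLMServiceUnavailableError,
    LLMTimeoutError,
)


def complete_with_retries(llm, messages, max_attempts=3):
    """Call llm.completion() and retry only the transient failure classes."""
    for attempt in range(1, max_attempts + 1):
        try:
            return llm.completion(messages=messages)
        except LLMBadRequestError:
            # Client-side problem (invalid params, malformed input): retrying won't help.
            raise
        except (LLMRateLimitError, LLMTimeoutError, LLMServiceUnavailableError):
            if attempt == max_attempts:
                raise
            time.sleep(2**attempt)  # simple exponential backoff
```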
Example: Using responses()
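Because `LLM.responses()` raises the same exception types, the same handler can be shared. This sketch forwards arguments without committing to the exact call signature:

```python
from openhands.sdk.llm.exceptions import (
    LLMRateLimitError,
    LLMResponseError,
    LLMServiceUnavailableError,
    LLMTimeoutError,
)


def call_llm(llm_method, *args, **kwargs):
    """Shared handler for both llm.completion and llm.responses."""
    try:
        return llm_method(*args, **kwargs)
    except LLMResponseError:
        # No usable action could be extracted from the model's response.
        raise
    except (LLMRateLimitError, LLMTimeoutError, LLMServiceUnavailableError):
        # Transient provider issues; a retry loop could go here instead.
        raise


# Usage: the same wrapper works for either path.
# result = call_llm(llm.responses, messages)
# result = call_llm(llm.completion, messages)
```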
Exception reference
All exceptions live under `openhands.sdk.llm.exceptions` unless noted.
- Provider/transport mapping (provider‑agnostic):
  - `LLMContextWindowExceedError` — Conversation exceeds the model’s context window. Without a condenser, thrown for both Chat and Responses paths.
  - `LLMAuthenticationError` — Invalid or missing credentials (401/403 patterns).
  - `LLMRateLimitError` — Provider rate limit exceeded.
  - `LLMTimeoutError` — SDK/lower‑level timeout while waiting for the provider.
  - `LLMServiceUnavailableError` — Temporary connectivity/service outage (e.g., 5xx, connection issues).
  - `LLMBadRequestError` — Client‑side request issues (invalid params, malformed input).
- Response parsing/validation:
  - `LLMMalformedActionError` — Model returned a malformed action.
  - `LLMNoActionError` — Model did not return an action when one was expected.
  - `LLMResponseError` — Could not extract an action from the response.
  - `FunctionCallConversionError` — Failed converting tool/function call payloads.
  - `FunctionCallValidationError` — Tool/function call arguments failed validation.
  - `FunctionCallNotExistsError` — Model referenced an unknown tool/function.
  - `LLMNoResponseError` — Provider returned an empty/invalid response (seen rarely, e.g., some Gemini models).
- Cancellation:
  - `UserCancelledError` — A user aborted the operation.
  - `OperationCancelled` — A running operation was cancelled programmatically.
Most of these exceptions inherit from the common base `LLMError`, so you can implement a catch‑all for unexpected SDK LLM errors while still keeping fine‑grained handlers for the most common cases.
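For example, a layered handler might keep fine‑grained branches for common cases and fall back to the base class (a sketch, assuming `LLMError` is importable from the same module and `conversation` already exists):

```python
from openhands.sdk.llm.exceptions import (
    LLMContextWindowExceedError,
    LLMError,
    LLMRateLimitError,
)

try:
    conversation.run()
except LLMContextWindowExceedError:
    print("Trim the history or configure a condenser.")
except LLMRateLimitError:
    print("Rate limited; retry with backoff.")
except LLMError as exc:
    # Catch-all for any other SDK LLM error not handled above.
    print(f"LLM call failed: {exc}")
```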
