LiteLLM Client

> **Optional Dependency**: LiteLLM is an optional dependency. Install with `pip install stirrup[litellm]`.

`stirrup.clients.litellm_client`

LiteLLM-based LLM client for multi-provider support.

This client uses LiteLLM to provide a unified interface to multiple LLM providers (OpenAI, Anthropic, Google, etc.) with automatic retries for transient failures.

Requires the `litellm` extra: `pip install stirrup[litellm]`
ChatMessage
```python
ChatMessage = Annotated[
    SystemMessage
    | UserMessage
    | AssistantMessage
    | ToolMessage,
    Field(discriminator="role"),
]
```

Discriminated union of all message types, automatically parsed based on the `role` field.
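To illustrate how the discriminated union dispatches on `role`, here is a minimal self-contained sketch using two stand-in message models. The `content` field and the models' exact shapes are assumptions for the example; only the `role` discriminator mechanics mirror the definition above.

```python
from typing import Annotated, Literal, Union

from pydantic import BaseModel, Field, TypeAdapter


# Minimal stand-ins for the real message models; only the
# `role` field (the discriminator) matters for routing.
class SystemMessage(BaseModel):
    role: Literal["system"] = "system"
    content: str


class UserMessage(BaseModel):
    role: Literal["user"] = "user"
    content: str


ChatMessage = Annotated[
    Union[SystemMessage, UserMessage],
    Field(discriminator="role"),
]

# Pydantic routes the raw dict to the matching model based on `role`.
msg = TypeAdapter(ChatMessage).validate_python(
    {"role": "user", "content": "What is 2 + 2?"}
)
assert isinstance(msg, UserMessage)
```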
ContextOverflowError
Bases: Exception
Raised when LLM context window is exceeded (max_tokens or length finish_reason).
AssistantMessage
Bases: BaseModel
LLM response message with optional tool calls and token usage tracking.
LLMClient
Bases: Protocol
Protocol defining the interface for LLM client implementations.
Any LLM client must implement this protocol to work with the Agent class. Provides text generation with tool support and model capability inspection.
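Because `LLMClient` is a `Protocol`, conformance is structural: any class with a compatible `generate` method satisfies it, with no inheritance required. A toy sketch of that mechanic (the `EchoClient` class and loose `Any` types are illustrative, not part of the library):

```python
import asyncio
from typing import Any, Protocol, runtime_checkable


@runtime_checkable
class LLMClient(Protocol):
    """Structural interface: any matching `generate` satisfies it."""

    async def generate(
        self, messages: list[Any], tools: dict[str, Any]
    ) -> Any: ...


class EchoClient:
    """Toy client used only to demonstrate structural typing."""

    async def generate(self, messages: list[Any], tools: dict[str, Any]) -> Any:
        return messages[-1]


client = EchoClient()
# Structural check: passes without subclassing LLMClient.
assert isinstance(client, LLMClient)
reply = asyncio.run(client.generate([{"role": "user", "content": "hi"}], {}))
```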
Reasoning
Bases: BaseModel
Extended thinking/reasoning content from models that support chain-of-thought reasoning.
TokenUsage
Bases: BaseModel
Token counts for LLM usage (input, output, reasoning tokens).
__add__
__add__(other: TokenUsage) -> TokenUsage
Add two TokenUsage objects together, summing each field independently.
Source code in src/stirrup/core/models.py
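Field-wise addition lets per-call usage be accumulated with `+` across a whole agent run. A pure-Python sketch of the semantics (the real class is a Pydantic model, and these field names are assumptions based on the description above):

```python
from dataclasses import dataclass


@dataclass
class TokenUsage:
    input_tokens: int = 0
    output_tokens: int = 0
    reasoning_tokens: int = 0

    def __add__(self, other: "TokenUsage") -> "TokenUsage":
        # Sum each field independently.
        return TokenUsage(
            self.input_tokens + other.input_tokens,
            self.output_tokens + other.output_tokens,
            self.reasoning_tokens + other.reasoning_tokens,
        )


# Accumulate usage from two LLM calls.
total = TokenUsage(100, 20, 0) + TokenUsage(50, 30, 10)
assert total == TokenUsage(150, 50, 10)
```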
Tool
Bases: BaseModel
Tool definition with name, description, parameter schema, and executor function.
Generic over:

- `P`: Parameter model type (must be a Pydantic BaseModel, or None for parameterless tools)
- `M`: Metadata type (should implement Addable for aggregation; use None for tools without metadata)
Tools are simple, stateless callables. For tools requiring lifecycle management (setup/teardown, resource pooling), use a ToolProvider instead.
Example with parameters:

```python
class CalcParams(BaseModel):
    expression: str

calc_tool = Tool[CalcParams, None](...)
```

Example without parameters:

```python
time_tool = Tool[None, None](...)
```
ToolCall
LiteLLMClient
```python
LiteLLMClient(
    model_slug: str,
    max_tokens: int,
    supports_audio_input: bool = False,
    reasoning_effort: str | None = None,
    kwargs: dict[str, Any] | None = None,
)
```
Bases: LLMClient
LiteLLM-based client supporting multiple LLM providers with unified interface.
Includes automatic retries for transient failures and token usage tracking.
Initialize LiteLLM client with model configuration and capabilities.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `model_slug` | `str` | Model identifier for LiteLLM (e.g., `'anthropic/claude-3-5-sonnet-20241022'`). | *required* |
| `max_tokens` | `int` | Maximum context window size in tokens. | *required* |
| `supports_audio_input` | `bool` | Whether the model supports audio inputs. | `False` |
| `reasoning_effort` | `str \| None` | Reasoning effort level for extended thinking models (e.g., `'medium'`, `'high'`). | `None` |
| `kwargs` | `dict[str, Any] \| None` | Additional arguments to pass to LiteLLM completion calls. | `None` |
Source code in src/stirrup/clients/litellm_client.py
generate
`async`

```python
generate(
    messages: list[ChatMessage], tools: dict[str, Tool]
) -> AssistantMessage
```
Generate assistant response with optional tool calls. Retries up to 3 times on timeout/connection errors.
Source code in src/stirrup/clients/litellm_client.py
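The retry behavior can be pictured as a simple bounded loop. This is a generic sketch, not the library's implementation: LiteLLM's actual exception types differ, the demo `flaky` coroutine is invented for illustration, and a real version would back off between attempts.

```python
import asyncio

MAX_ATTEMPTS = 3


async def with_retries(call, *args):
    """Retry `call` on timeout/connection errors, up to MAX_ATTEMPTS tries."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        try:
            return await call(*args)
        except (TimeoutError, ConnectionError):
            if attempt == MAX_ATTEMPTS:
                raise  # transient failures exhausted: surface the error
            await asyncio.sleep(0)  # a real client would back off here


# Demo: a flaky call that fails twice, then succeeds on the third try.
calls = {"n": 0}


async def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"


result = asyncio.run(with_retries(flaky))
assert result == "ok" and calls["n"] == 3
```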
to_openai_messages
Convert ChatMessage list to OpenAI-compatible message dictionaries.
Handles all message types: SystemMessage, UserMessage, AssistantMessage, and ToolMessage. Preserves reasoning content and tool calls for assistant messages.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `msgs` | `list[ChatMessage]` | List of ChatMessage objects (System, User, Assistant, or Tool messages). | *required* |
Returns:
| Type | Description |
|---|---|
| `list[dict[str, Any]]` | List of message dictionaries ready for the OpenAI API. |
Raises:
| Type | Description |
|---|---|
| `NotImplementedError` | If an unsupported message type is encountered. |
Source code in src/stirrup/clients/utils.py
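The conversion maps each typed message to a plain role/content dict and raises on anything it does not recognize. A simplified self-contained sketch (the stand-in dataclasses and `ROLE` table are assumptions; the real function also carries reasoning content and tool calls for assistant messages):

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class SystemMessage:
    content: str


@dataclass
class UserMessage:
    content: str


# Role string for each supported message type.
ROLE = {SystemMessage: "system", UserMessage: "user"}


def to_openai_messages(msgs: list[Any]) -> list[dict[str, Any]]:
    out = []
    for m in msgs:
        role = ROLE.get(type(m))
        if role is None:
            # Mirrors the documented behavior for unsupported types.
            raise NotImplementedError(f"Unsupported message type: {type(m)!r}")
        out.append({"role": role, "content": m.content})
    return out


converted = to_openai_messages(
    [SystemMessage("You are helpful."), UserMessage("Hi!")]
)
assert converted == [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi!"},
]
```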
to_openai_tools
Convert Tool objects to OpenAI function calling format.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `tools` | `dict[str, Tool]` | Dictionary mapping tool names to Tool objects. | *required* |
Returns:
| Type | Description |
|---|---|
| `list[dict[str, Any]]` | List of tool definitions in OpenAI's function calling format. |
Example:

```python
tools = {"calculator": calculator_tool}
openai_tools = to_openai_tools(tools)
# Returns: [{"type": "function", "function": {"name": "calculator", ...}}]
```