vibe.llm_providers.anthropic¶
Anthropic LLM provider integration.
AnthropicConverter ¶
Converter for Anthropic Claude API format.
Anthropic has specific requirements:

- System messages are passed separately (not in the messages array)
- Messages must alternate between user and assistant roles
- Tool calls are content blocks within assistant messages
- Tool results are content blocks within user messages
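The rules above can be sketched as plain data. This is a hypothetical conversation in the Anthropic wire shape (the tool ID, tool name, and values are illustrative, not from the source):

```python
# System prompt lives outside the messages array.
system_prompt = "You are a helpful assistant."

messages = [
    {"role": "user", "content": "What is the weather in Paris?"},
    {
        "role": "assistant",
        "content": [
            {"type": "text", "text": "Let me check."},
            # Tool call: a tool_use content block inside the assistant message.
            {"type": "tool_use", "id": "toolu_01", "name": "get_weather",
             "input": {"city": "Paris"}},
        ],
    },
    {
        "role": "user",
        "content": [
            # Tool result: a tool_result content block inside a user message.
            {"type": "tool_result", "tool_use_id": "toolu_01",
             "content": "18 C, cloudy"},
        ],
    },
]

# Roles must strictly alternate between user and assistant.
roles = [m["role"] for m in messages]
assert all(a != b for a, b in zip(roles, roles[1:]))
```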
This converter returns (system_prompt, messages) tuple from convert_all.
convert_system ¶
convert_system(msg: SystemMessage) -> str
Return system content as string (accumulated separately).
convert_user ¶
convert_user(msg: UserMessage) -> dict[str, Any]
Convert user message to Anthropic user role format.
convert_assistant ¶
convert_assistant(msg: AssistantMessage) -> dict[str, Any]
Convert assistant message with optional tool_use content blocks.
convert_tool_result ¶
convert_tool_result(msg: ToolResult) -> dict[str, Any]
Convert tool result to Anthropic tool_result content block in user message.
convert_all ¶
Convert all messages, returning a (system_prompt, messages) tuple.
AnthropicProvider ¶
LLMProvider implementation for Anthropic's Claude 3 models.
Configuration options (common, via ProviderConfig):

- api_key: Your Anthropic API key (required)
- model: The model name (e.g., "claude-3-sonnet-20240229")
- max_tokens: The maximum number of tokens to generate (default: 4096)
- temperature: Controls randomness (0.0 to 1.0, default: 0.7)
- tools: Enable tool calling (default: True)
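As a sketch, the options above could be supplied like this (the dict form and the placeholder key are assumptions; the real ProviderConfig may be a typed object):

```python
# Hypothetical ProviderConfig values; key names follow the list above.
config = {
    "api_key": "sk-ant-xxxx",            # required; placeholder value
    "model": "claude-3-sonnet-20240229",  # model name
    "max_tokens": 4096,                   # default
    "temperature": 0.7,                   # default; valid range 0.0 to 1.0
    "tools": True,                        # enable tool calling (default)
}

assert 0.0 <= config["temperature"] <= 1.0
```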
convert_messages_to_provider_format ¶
convert_messages_to_provider_format(messages: list[Message], tools: list[Tool] | None = None) -> tuple[str | None, list[dict[str, Any]]]
Convert internal messages to Anthropic native format.
Uses AnthropicConverter for clean separation of conversion logic.
| Parameters: | |
|---|---|
| `messages` | Internal messages to convert (`list[Message]`) |
| `tools` | Optional tool definitions (`list[Tool]` or `None`, default `None`) |

| Returns: | |
|---|---|
| `tuple[str or None, list[dict[str, Any]]]` | The system prompt (if any) and the Anthropic-native messages |
stream_generate ¶
stream_generate(messages: list[Message], sequence_number: int, *, session_id: str, assistant_name: str, endpoint_name: str, turn_id: str, previous_response_id: str | None = None, tool_outputs: list[ToolOutput] | None = None, unanswered_predefined_questions: list[dict[str, Any]] | None = None) -> Generator[StreamChunk, None, None]
Stream LLM responses with tool call support.
| Parameters: | |
|---|---|
| `messages` | Conversation messages to send (`list[Message]`) |
| `sequence_number` | `int` |
| `session_id` | `str` (keyword-only) |
| `assistant_name` | `str` (keyword-only) |
| `endpoint_name` | `str` (keyword-only) |
| `turn_id` | `str` (keyword-only) |
| `previous_response_id` | `str` or `None` (default `None`) |
| `tool_outputs` | `list[ToolOutput]` or `None` (default `None`) |
| `unanswered_predefined_questions` | `list[dict[str, Any]]` or `None` (default `None`) |

| Yields: | |
|---|---|
| `StreamChunk` | Streamed response chunks, including tool call events |
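A consumption sketch with a stand-in generator, since a caller typically joins the streamed chunks into a full reply. The `StreamChunk` field name `text` and the stub generator are assumptions, not the real API:

```python
from dataclasses import dataclass
from typing import Generator

@dataclass
class StreamChunk:
    """Stand-in chunk type; the real StreamChunk fields may differ."""
    text: str

def stream_generate_stub() -> Generator[StreamChunk, None, None]:
    # In the real provider this would stream from the Anthropic API.
    for piece in ("Hel", "lo"):
        yield StreamChunk(text=piece)

# Typical consumption pattern: accumulate chunk text as it arrives.
reply = "".join(chunk.text for chunk in stream_generate_stub())
assert reply == "Hello"
```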
get_capabilities ¶
get_capabilities() -> ProviderCapabilities
Return Anthropic provider capability flags for feature gating.
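A feature-gating sketch with a stand-in capabilities object; the flag names here are assumptions, not the real ProviderCapabilities fields:

```python
from dataclasses import dataclass

@dataclass
class ProviderCapabilities:
    """Stand-in; real flag names may differ."""
    supports_tools: bool = True
    supports_streaming: bool = True

caps = ProviderCapabilities()

# Gate features on the reported capabilities instead of hard-coding them.
enabled_features = []
if caps.supports_tools:
    enabled_features.append("tools")
if caps.supports_streaming:
    enabled_features.append("streaming")
```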
get_usage_stats ¶
get_usage_stats() -> UsageStats | None
Return usage statistics from the last API call.