ProviderFactory - creates and configures LLM providers.
Responsibilities:
- Endpoint resolution (including Jinja template evaluation)
- Development overrides (endpoint switching, tool toggles)
- Playback from log files
- SystemProxyProvider wrapping for assistant mode
- Tool configuration
ProviderWithConfig
Result of creating a provider; includes both the provider instance and its configuration.
ProviderFactory
Factory for creating and configuring LLM providers.
This class consolidates:
- get_assistant_provider
- _get_assisted_provider
- _get_endpoint_name_for_assistant
- _get_endpoint_config_for_provider
- _apply_dev_overrides
__init__
__init__(assistant_name: str, endpoints: dict[str, Any], session_getter: Callable[[], dict[str, Any]], debug_mode: bool = False, interview_mode: str = 'standard', question_definitions: dict[str, Any] | None = None) -> None
Initialize the provider factory.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `assistant_name` | `str` | Name of the assistant | *required* |
| `endpoints` | `dict[str, Any]` | Dict of endpoint configurations (from `app.LLM_ENDPOINTS`) | *required* |
| `session_getter` | `Callable[[], dict[str, Any]]` | Callable that returns the Flask session dict | *required* |
| `debug_mode` | `bool` | Whether the app is in debug mode | `False` |
| `interview_mode` | `str` | Interview mode (`"standard"` or `"assistant"`) | `'standard'` |
| `question_definitions` | `dict[str, Any] \| None` | Optional dict of question definitions | `None` |
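A minimal construction sketch. The module path `provider_factory` and the contents of the endpoint dict are illustrative assumptions; only the constructor signature above comes from this page.

```python
from flask import session

from provider_factory import ProviderFactory  # assumed module path

# Illustrative endpoint definitions; the real app passes app.LLM_ENDPOINTS.
LLM_ENDPOINTS = {
    "default": {"model": "gpt-4o", "base_url": "https://api.example.com/v1"},
    "assistant": {"model": "gpt-4o-mini", "base_url": "https://api.example.com/v1"},
}

factory = ProviderFactory(
    assistant_name="interview-assistant",
    endpoints=LLM_ENDPOINTS,
    session_getter=lambda: session,  # callable returning the Flask session dict
    debug_mode=False,
    interview_mode="assistant",
    question_definitions=None,
)
```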
resolve_endpoint_name
resolve_endpoint_name(endpoint_template: str) -> str
Resolve endpoint name from template string.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `endpoint_template` | `str` | Endpoint name or Jinja template string | *required* |
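A hedged usage sketch, continuing the `factory` from the construction example above. The variables available inside the Jinja template (such as `debug` here) are assumptions; this page does not define the template context.

```python
# A plain endpoint name resolves to itself.
name = factory.resolve_endpoint_name("default")

# A Jinja template is evaluated first; the "debug" variable is assumed,
# since the page does not specify what the template context contains.
name = factory.resolve_endpoint_name(
    "{% if debug %}assistant{% else %}default{% endif %}"
)
```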
get_endpoint_config
get_endpoint_config(endpoint_name: str) -> dict[str, Any]
Get the full endpoint configuration.
Returns:

| Type | Description |
| --- | --- |
| `dict[str, Any]` | Endpoint configuration dict |

Raises:

| Type | Description |
| --- | --- |
| `ValueError` | If endpoint is not defined |
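A short sketch of handling the documented `ValueError`. The endpoint names and the `"model"` key come from the illustrative dict in the construction example, not from this page.

```python
try:
    config = factory.get_endpoint_config("assistant")
except ValueError:
    # Raised when the requested endpoint is not defined.
    config = factory.get_endpoint_config("default")

print(config["model"])  # "model" is a key from the illustrative endpoint dict
```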
create
create(endpoint_template: str, get_pending_questions: Callable[[], list[str]] | None = None) -> ProviderWithConfig
Create a configured provider instance.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `endpoint_template` | `str` | Endpoint name or Jinja template | *required* |
| `get_pending_questions` | `Callable[[], list[str]] \| None` | Optional callback for SystemProxyProvider (required if `interview_mode == "assistant"`) | `None` |
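A sketch of assistant-mode creation. The attribute names `provider` and `config` on the result are inferred from the `ProviderWithConfig` description above and should be treated as assumptions.

```python
def get_pending_questions() -> list[str]:
    # Illustrative stand-in; a real callback would read interview state.
    return ["name", "email"]

result = factory.create(
    endpoint_template="assistant",
    get_pending_questions=get_pending_questions,  # required when interview_mode == "assistant"
)

provider = result.provider  # attribute names assumed from ProviderWithConfig
config = result.config
```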
configure_tools
configure_tools(provider: LLMProvider, unanswered_questions: list[dict[str, Any]]) -> None
Configure tools on a provider.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `provider` | `LLMProvider` | Provider instance to configure | *required* |
| `unanswered_questions` | `list[dict[str, Any]]` | List of unanswered predefined questions | *required* |
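A sketch of configuring tools on a previously created provider. The shape of each question dict (`id`, `text`) is a guess; this page only specifies `list[dict[str, Any]]`.

```python
unanswered = [
    {"id": "email", "text": "What is your email address?"},  # assumed dict shape
]
factory.configure_tools(provider, unanswered)
```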
create_assisted_provider
create_assisted_provider(endpoint_name: str) -> LLMProvider
Create a provider for assisted text generation (tools disabled).
Raises:

| Type | Description |
| --- | --- |
| `ValueError` | If the endpoint is not found and no default endpoint exists |
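A sketch reusing the illustrative endpoint name from earlier; per the description above, the returned provider has tools disabled.

```python
# Tools are disabled on the returned provider, so it is suited to plain
# assisted text generation (e.g. drafting a suggested answer).
assisted = factory.create_assisted_provider("assistant")
```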