smart_toy LLM Setup

Set the provider key that matches the LLM you choose. Supported providers include Anthropic, OpenAI, Azure OpenAI, Bedrock, and xAI.

code LLM Keys
OPENAI_API_KEY=your_key_here
ANTHROPIC_API_KEY=your_key_here
XAI_API_KEY=your_key_here

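After exporting a key, you can confirm it is visible to your process with the standard library alone. A minimal sketch (the `setdefault` line is only for demonstration; in practice the key comes from your shell or `.env` file):

code Verify Key
```python
import os

# For demonstration only -- normally set in your shell or .env file.
os.environ.setdefault("ANTHROPIC_API_KEY", "your_key_here")

key = os.environ.get("ANTHROPIC_API_KEY")
assert key is not None, "provider key not set"
print(key[:4] + "...")  # never log the full key
```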
Initialize a DeepThinker with llm_provider and llm_model, or pass your own gateway via llm_gateway_instance.
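A minimal sketch of the provider-based setup. The import path, provider string, and model ID below are illustrative placeholders, not verified identifiers:

code Provider Setup
```python
from analogai.deepthink import DeepThinker  # import path assumed

# llm_provider selects a built-in gateway; llm_model picks the model it serves.
thinker = DeepThinker(
    llm_provider="anthropic",   # any supported provider
    llm_model="your-model-id",  # the model ID your provider exposes
)
```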

For custom deployments, implement the LlmGateway interface and inject it at runtime. This is useful when you already have a provider client and want to reuse it.

extension Custom LLM Provider

Create a class that implements the gateway interface, then pass an instance into DeepThinker as llm_gateway_instance. You can still set llm_model on your gateway if it supports it.

code Custom Gateway
from analogai.deepthink.infrastructure.gateways.llm.gateway import LlmGateway

class MyGateway(LlmGateway):
    def __init__(self, client):
        self._client = client
        self._system_prompt = ""
        self._model = ""  # populated via set_model()

    def set_system_prompt(self, prompt: str) -> None:
        self._system_prompt = prompt

    def set_model(self, model: str) -> None:
        self._model = model

    def generate_completion(self, user_message: str) -> str:
        # use your provider SDK here
        return self._client.complete(user_message, system=self._system_prompt)

    async def generate_streaming_completion(self, user_message: str):
        # Fallback: yield the full completion as a single chunk. Swap in
        # your SDK's streaming API for true token-by-token output.
        yield self.generate_completion(user_message)

custom_gateway = MyGateway(client=your_client)

thinker = DeepThinker(
    llm_gateway_instance=custom_gateway,
    llm_model="your-model-id",
)
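Before wiring a gateway into DeepThinker, it can help to exercise it in isolation. A self-contained sketch with a hypothetical echo client (both classes below are stand-ins, not part of the library; the gateway duck-types the interface rather than importing the real LlmGateway base):

code Gateway Smoke Test
```python
class EchoClient:
    """Hypothetical provider client that echoes its inputs."""
    def complete(self, user_message, system=""):
        return f"[{system}] {user_message}"

class EchoGateway:
    """Duck-types the gateway interface for local testing."""
    def __init__(self, client):
        self._client = client
        self._system_prompt = ""
        self._model = ""

    def set_system_prompt(self, prompt):
        self._system_prompt = prompt

    def set_model(self, model):
        self._model = model

    def generate_completion(self, user_message):
        return self._client.complete(user_message, system=self._system_prompt)

gateway = EchoGateway(EchoClient())
gateway.set_system_prompt("You are terse.")
print(gateway.generate_completion("ping"))  # → [You are terse.] ping
```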

See Quickstart for a working example.