Quickstart

The smallest end-to-end workflow is: initialize a thinker, store a fact, and query it back. The example below uses Anthropic, but any supported provider works the same way; only the provider enum and model name change.

Quickstart
import os
os.environ["ANTHROPIC_API_KEY"] = "your_key_here"

from analogai.deepthink.thinker import DeepThinker
from analogai.deepthink.infrastructure.gateways.llm.enums.llm_provider import LlmProvider

thinker = DeepThinker(
    llm_provider=LlmProvider.ANTHROPIC,
    llm_model="claude-haiku-4-5-20251001",
)

thinker.remember("Paris is the capital of France.")
answer = thinker.search("What is the capital of France?")
print(answer)

To build richer memory, call remember multiple times to store related facts, then use generate_context to obtain a curated context block that you can splice into your own LLM prompts.
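As a sketch of that pattern, the snippet below stores several facts and then requests a context block. The exact signature of generate_context (a single query string, returning a string) is an assumption here; confirm it against the API reference.

```python
from analogai.deepthink.thinker import DeepThinker
from analogai.deepthink.infrastructure.gateways.llm.enums.llm_provider import LlmProvider

thinker = DeepThinker(
    llm_provider=LlmProvider.ANTHROPIC,
    llm_model="claude-haiku-4-5-20251001",
)

# Store several related facts to build a richer memory.
facts = [
    "Paris is the capital of France.",
    "The Louvre is located in Paris.",
    "France uses the euro as its currency.",
]
for fact in facts:
    thinker.remember(fact)

# Assumed call: retrieve a curated context block for a topic, then
# embed it in a prompt for your own LLM call.
context = thinker.generate_context("Tell me about Paris.")
prompt = f"Using this context:\n{context}\n\nWrite a short travel note about Paris."
```

This keeps memory management and generation decoupled: the thinker curates what is relevant, while you stay in control of the final prompt and model call.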

Next: explore Main Operations and LLM Setup.