Context Engineering

Use Deepthink to build a context block you can append to an existing agent prompt. This lets you keep your current LLM provider while improving grounding. See Quickstart for setup.

thinker.remember("Paris is the capital of France.")
thinker.remember("France is in Europe.")

user_input = "Tell me about France."
context = thinker.generate_context(user_input)

# Separate the system instructions, the context block, and the
# user message with blank lines.
prompt = "You are a helpful assistant.\n\n" + context + "\n\n" + user_input

response = your_llm_provider.chat(prompt)
print(response)

This pattern reduces hallucinations by grounding the model with curated memory.
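If you apply this pattern in several places, the prompt assembly can be factored into a small helper. A minimal sketch, assuming the same blank-line layout as the example above; the `build_prompt` name is illustrative and not part of the Deepthink API:

```python
def build_prompt(system: str, context: str, user_input: str) -> str:
    """Assemble a grounded prompt: system instructions first, then the
    generated context block, then the user's message, each separated
    by a blank line."""
    return "\n\n".join([system, context, user_input])

# Example assembly with a hard-coded context block standing in for
# thinker.generate_context(...).
prompt = build_prompt(
    "You are a helpful assistant.",
    "Paris is the capital of France. France is in Europe.",
    "Tell me about France.",
)
print(prompt)
```

Keeping the assembly in one function makes it easy to change the layout later (for example, wrapping the context block in delimiters) without touching every call site.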