Character: The name of the LLM model to use. Must be an Ollama model.
temperature
Numeric: The temperature for the model.
base_url
Character: Base URL of the Ollama server.
think
Optional Logical or Character {"low", "medium", "high"}: Default thinking mode for this
config. Logical values target models such as deepseek or qwen3; character values target
gpt-oss. Can be overridden on a per-call basis.
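Taken together, these parameters might be combined as in the sketch below. Note that `ollama_config()` is a hypothetical constructor name standing in for whatever function this package actually exports; the default `base_url` shown is Ollama's standard local endpoint:

```r
# Hypothetical constructor; the real function name depends on this package's API.
cfg <- ollama_config(
  model       = "qwen3",                  # name of an Ollama model
  temperature = 0.2,                      # lower values give more deterministic output
  base_url    = "http://localhost:11434", # Ollama's default local endpoint
  think       = TRUE                      # logical for deepseek/qwen3-style models;
                                          # use "low", "medium", or "high" for gpt-oss
)
```

A per-call override of `think` would then take precedence over the config default.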