Creates an OpenAIConfig object that can be passed to create_agent().
Usage

config_OpenAI(
  model_name,
  temperature = TEMPERATURE_DEFAULT,
  base_url = OPENAI_URL_DEFAULT,
  api_key = NULL,
  api_key_env = OPENAI_API_KEY_ENV_DEFAULT,
  keychain_service = NULL,
  organization = NULL,
  project = NULL,
  timeout = OPENAI_TIMEOUT_DEFAULT,
  extra_headers = NULL,
  extra_body = NULL,
  enable_thinking = NULL,
  validate_model = FALSE
)
Value

An OpenAIConfig object.
Arguments

model_name: Character. The name of the LLM model to use.

temperature: Numeric in [0, 2]. The sampling temperature for the model.

base_url: Character. Base URL of the OpenAI-compatible server.

api_key: Optional character. API key.

api_key_env: Character. Name of the environment variable containing the API key.

keychain_service: Optional character. macOS Keychain service containing the API key.

organization: Optional character. OpenAI organization ID.

project: Optional character. OpenAI project ID.

timeout: Numeric in (0, Inf). Request timeout in seconds.

extra_headers: Optional list. Additional HTTP headers.

extra_body: Optional list. Additional request body fields.

enable_thinking: Optional logical. Whether to enable model thinking for compatible local servers.

validate_model: Logical. Whether to validate model availability using the models endpoint.
Examples
cfg <- config_OpenAI(
  model_name     = "local-model",
  temperature    = 0.4,
  base_url       = "http://localhost:1234/v1/",
  validate_model = FALSE
)
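A slightly fuller sketch of a hosted-endpoint configuration, assuming the defaults described in the arguments above; the model name, environment-variable name, and the call to create_agent() shown here are illustrative assumptions, not verified package behavior:

```r
# Resolve the API key from an environment variable instead of hard-coding it
# (assumption: "OPENAI_API_KEY" is the conventional variable name).
cfg <- config_OpenAI(
  model_name     = "gpt-4o-mini",     # illustrative model name
  temperature    = 0.2,
  api_key_env    = "OPENAI_API_KEY",
  timeout        = 60,                # seconds
  validate_model = TRUE               # check availability via the models endpoint
)

# The resulting OpenAIConfig object is then passed to create_agent()
# (assumption: create_agent() accepts the config as its argument).
agent <- create_agent(cfg)
```

Enabling validate_model adds one extra request to the models endpoint at configuration time, which catches a mistyped model name early rather than at the first agent call.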