LLMR (version 0.4.2)

call_llm_sweep: Mode 1: Parameter Sweep - Vary One Parameter, Fixed Message

Description

Sweeps through different values of a single parameter while keeping the messages constant. Useful for hyperparameter tuning, temperature experiments, and similar comparisons. This function requires the parallel environment to be set up with `setup_llm_parallel`; call `reset_llm_parallel` when finished.

Usage

call_llm_sweep(base_config, param_name, param_values, messages, ...)

Value

A tibble with one row per value in `param_values`. Columns: swept_param_name, a column named after the swept parameter holding its value for that call, provider, model, all other model parameters, response_text, raw_response_json, success, and error_message.

Arguments

base_config

Base llm_config object to modify.

param_name

Character. Name of the parameter to vary (e.g., "temperature", "max_tokens").

param_values

Vector. Values to test for the parameter.

messages

List of message objects (same for all calls).

...

Additional arguments passed to `call_llm_par` (e.g., tries, verbose, progress).

Examples

if (FALSE) {
  # Temperature sweep
  config <- llm_config(provider = "openai", model = "gpt-4o-mini",
                       api_key = Sys.getenv("OPENAI_API_KEY"))

  messages <- list(list(role = "user", content = "What is 15 * 23?"))
  temperatures <- c(0, 0.3, 0.7, 1.0, 1.5)

  setup_llm_parallel(workers = 4, verbose = TRUE)
  results <- call_llm_sweep(config, "temperature", temperatures, messages)
  reset_llm_parallel(verbose = TRUE)
}
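The returned tibble lends itself to ordinary dplyr manipulation. A minimal post-processing sketch, using a mock tibble with the columns documented under Value so that no API call is needed (all values are invented for illustration):

```r
library(dplyr)

# Mock of the result shape described under Value; values are invented.
results <- tibble(
  swept_param_name = "temperature",
  temperature      = c(0, 0.7, 1.5),
  provider         = "openai",
  model            = "gpt-4o-mini",
  response_text    = c("345", "345", NA),
  success          = c(TRUE, TRUE, FALSE),
  error_message    = c(NA, NA, "timeout")
)

# Keep successful calls and compare responses across temperature values
ok <- results %>%
  filter(success) %>%
  select(temperature, response_text)
print(ok)
```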