
This function uses the OpenAI API to generate responses to user input, interacting with the gpt-4o-mini model by default. Currently, "gpt-4o-mini", "gpt-4o", "gpt-4", "gpt-4-turbo", and "gpt-3.5-turbo" can be selected as the OpenAI model.
chat4R(
content,
Model = "gpt-4o-mini",
temperature = 1,
simple = TRUE,
fromJSON_parsed = FALSE,
api_key = Sys.getenv("OPENAI_API_KEY")
)
Value: A data frame containing the response from the GPT model.
content: A string containing the user's input message.
Model: A string specifying the GPT model to use (default: "gpt-4o-mini").
temperature: A numeric value controlling the randomness of the model's output (default: 1).
simple: Logical; if TRUE, only the content of the model's message is returned.
fromJSON_parsed: Logical; if TRUE, the content is parsed from JSON.
api_key: A string containing the user's OpenAI API key. Defaults to the value of the environment variable "OPENAI_API_KEY".
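
As a rough sketch of how these arguments combine in practice (the model name, prompt, and temperature value below are illustrative only, not package defaults):

if (FALSE) {
Sys.setenv(OPENAI_API_KEY = "Your API key")

# Select another supported model and lower the randomness of the output
res <- chat4R(
  content = "Summarise the history of R in one sentence.",
  Model = "gpt-4o",
  temperature = 0.2
)
res
}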
Author: Satoshi Kume
Chat4R Function
if (FALSE) {
# Set your OpenAI API key before calling the function
Sys.setenv(OPENAI_API_KEY = "Your API key")

# Ask the model a question and print the response
response <- chat4R(content = "What is the capital of France?")
response
}
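
If the full response object is of interest rather than only the message text, simple = FALSE can be set; since this page does not describe the exact columns of the returned data frame, str() is used here just to inspect the result:

if (FALSE) {
# Return the full response rather than only the message content
full <- chat4R(
  content = "What is the capital of France?",
  simple = FALSE
)
str(full)
}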