
⚠️ A newer version (0.3.4) of this package is available.

tidyllm (version 0.1.0)

Tidy Integration of Large Language Models

Description

A tidy interface for integrating large language model (LLM) APIs such as 'Claude', 'ChatGPT', 'Groq', and local models via 'Ollama' into R workflows. The package supports text- and media-based interactions, interactive message history, stateful rate limit handling, and a tidy, pipeline-oriented interface for streamlined integration into data workflows.
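A minimal pipeline sketch of the workflow the description outlines, built from the exported functions listed further down this page (`llm_message()`, `claude()`, `last_reply()`). This is illustrative only: it assumes an Anthropic API key is configured (typically via an environment variable), and the call runs with the package's default model and arguments.

```r
library(tidyllm)

# Build a message object, send it to the Anthropic API,
# and extract the assistant's reply as a character string.
reply <- llm_message("Summarise the iris dataset in one sentence.") |>
  claude() |>
  last_reply()

print(reply)
```

The pipe-friendly design means each step returns an `LLMMessage` object (until `last_reply()`), so provider functions like `claude()`, `chatgpt()`, `groq()`, or `ollama()` can be swapped in without restructuring the pipeline.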


Install

install.packages('tidyllm')

Monthly Downloads

428

Version

0.1.0

License

MIT + file LICENSE

Maintainer

Eduard Brüll

Last Published

October 10th, 2024

Functions in tidyllm (0.1.0)

tidyllm-package

tidyllm: Tidy Integration of Large Language Models
update_rate_limit

Update the standard API rate limit info in the hidden .tidyllm_rate_limit_env environment
parse_duration_to_seconds

An internal function to parse the duration strings that OpenAI APIs return for rate limit resets
ollama_list_models

Retrieve and return model information from the Ollama API
claude

Call the Anthropic API to interact with Claude models
LLMMessage

Large Language Model Message Class
generate_callback_function

Generate API-Specific Callback Function for Streaming Responses
initialize_api_env

Initialize or Retrieve API-specific Environment
last_reply

Retrieve Last Reply from an Assistant
df_llm_message

Convert a Data Frame to an LLMMessage Object
llm_message

Create or Update Large Language Model Message Object
chatgpt

Call the OpenAI API to interact with ChatGPT or o-reasoning models
wait_rate_limit

Wait for rate limit reset times to elapse if necessary
groq

Call the Groq API to interact with fast open-source models hosted on Groq
perform_api_request

Perform an API request to interact with language models
rate_limit_info

Get the current rate limit information for all or a specific API
ollama

Send an LLMMessage to the Ollama API
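To illustrate the local-model path, here is a hedged sketch using `ollama_list_models()` and `ollama()` from the list above. It assumes an Ollama server is running locally and that a model argument (shown here as `.model`, an assumed argument name) selects the model; `"llama3"` is a placeholder for whatever model you have pulled.

```r
library(tidyllm)

# Inspect which models the local Ollama server provides
ollama_list_models()

# Chat with a local model; no API key required
llm_message("Explain what a tidy data frame is.") |>
  ollama(.model = "llama3") |>
  last_reply()
```

Because `ollama()` takes the same `LLMMessage` objects as the hosted-API functions, a pipeline can be prototyped against a local model and later pointed at `claude()` or `chatgpt()` unchanged.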