LLMAgentR
Overview
LLMAgentR is an R package for building Language Model Agents using a modular state graph execution framework. Inspired by LangGraph and LangChain architectures, it supports iterative workflows for research, data analysis, and automation.
Installation
install.packages("LLMAgentR")Development version
To get the latest features or bug fixes, you can install the development
version of LLMAgentR from GitHub:
# If needed
install.packages("remotes")
remotes::install_github("knowusuboaky/LLMAgentR")
See the full function reference or the package website for more details.
Environment Setup
API Setup
Sys.setenv(
  OPENAI_API_KEY    = "your-openai-key",
  GROQ_API_KEY      = "your-groq-key",
  ANTHROPIC_API_KEY = "your-anthropic-key",
  DEEPSEEK_API_KEY  = "your-deepseek-key",
  DASHSCOPE_API_KEY = "your-dashscope-key",
  GH_MODELS_TOKEN   = "your-github-models-token"
)
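Before making any calls, you can confirm that a key is visible to the current R session. This check uses only base R; the variable shown is one of those set above.
# Returns TRUE if the corresponding key is set in this session
nzchar(Sys.getenv("OPENAI_API_KEY"))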
LLM Support (Minimal Wrapper)
The chatLLM package lets you interact with large language models (LLMs) either through direct calls or via reusable minimal wrappers.
Load the Package
library(chatLLM)
Minimal Wrapper Function
Create a lightweight wrapper around call_llm() for reuse; it optionally provides verbose output. For comparison, a direct call looks like this:
call_llm(
  prompt      = "Summarize the capital of France.",
  provider    = "groq",
  model       = "llama3-8b",
  temperature = 0.7,
  max_tokens  = 200,
  verbose     = TRUE
)
The reusable wrapper:
my_llm_wrapper <- function(prompt, verbose = FALSE) {
  if (verbose) {
    message("[my_llm_wrapper] Sending prompt to LLM...")
  }
  # In quiet mode, suppress console output but still return the response
  response_text <- if (verbose) {
    call_llm(
      prompt     = prompt,
      provider   = "openai",
      model      = "gpt-4o",
      max_tokens = 3000,
      verbose    = TRUE
    )
  } else {
    suppressMessages(
      suppressWarnings(
        call_llm(
          prompt     = prompt,
          provider   = "openai",
          model      = "gpt-4o",
          max_tokens = 3000,
          verbose    = FALSE
        )
      )
    )
  }
  if (verbose) {
    message("[my_llm_wrapper] Response received.")
  }
  return(response_text)
}
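Once defined, the wrapper is called like any other R function. This reuses the prompt from the direct call above and assumes, as the wrapper's return value suggests, that the response comes back as a character string:
# Quiet call; set verbose = TRUE to see the progress messages
response <- my_llm_wrapper("Summarize the capital of France.")
cat(response)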
Quick Access Shortcut
Alternatively, preconfigure an LLM call for one-liners:
my_llm_wrapper <- call_llm(
  provider   = "openai",
  model      = "gpt-4o",
  max_tokens = 3000,
  verbose    = TRUE
)
Related Package: chatLLM
The chatLLM package (now
available on CRAN) offers a modular interface for interacting with LLM
providers including OpenAI, Groq, Anthropic, DeepSeek,
DashScope, and GitHub Models.
install.packages("chatLLM")Agent Articles
Detailed guides now live in pkgdown Articles (one per agent):
- Code Generation Agent
- SQL Query Agent
- Research Agent
- Interpreter Agent
- Document Summarizer Agent
- Data Cleaning Agent
- Forecasting Agent
- Data Wrangling Agent
- Weather Agent
- Feature Engineering Agent
- Visualization Agent
Articles on custom graph workflows and a full article index are also available on the package website. The basic pattern shared by the agent builders is sketched below.
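Each builder takes an LLM function (such as my_llm_wrapper defined earlier) together with a task description and returns the agent's output. The call below is a rough sketch of that pattern only: the builder name build_code_agent() and its argument names are assumptions here, so consult the function reference for the exact signatures.
library(LLMAgentR)
# Illustrative sketch: the builder and argument names below are assumptions,
# not a confirmed API; see the function reference for the exact signatures.
result <- build_code_agent(
  llm        = my_llm_wrapper,  # any function(prompt, ...) returning text
  user_input = "Write an R function that computes a 7-day rolling mean.",
  verbose    = TRUE
)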
License
MIT (c) Kwadwo Daddy Nyame Owusu Boakye