
edgemodelr (version 0.1.5)

Local Large Language Model Inference Engine

Description

Enables R users to run large language models locally using 'GGUF' model files and the 'llama.cpp' inference engine. Provides a complete R interface for loading models, generating text completions, and streaming responses in real time. Supports local inference without requiring cloud APIs or internet connectivity, ensuring complete data privacy and control. Based on the 'llama.cpp' project by Georgi Gerganov (2023).
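The workflow the description outlines, loading a local GGUF file, generating a completion, and releasing the model, might look like the following sketch. The file path is a placeholder, and argument names such as `n_ctx` and `n_predict` are assumptions modeled on common llama.cpp bindings, not confirmed signatures; check the package reference for the exact interface.

```r
library(edgemodelr)

# Path to a GGUF model file already on disk (placeholder; adjust to your setup)
model_path <- "models/tinyllama-1.1b-chat.Q4_K_M.gguf"

# Load the model; n_ctx (context window size) is an assumed argument name
ctx <- edge_load_model(model_path, n_ctx = 2048)

if (is_valid_model(ctx)) {
  # Generate a completion; n_predict caps the number of generated tokens
  out <- edge_completion(ctx, "The capital of France is", n_predict = 32)
  cat(out, "\n")
}

# Free the model context and release its memory when done
edge_free_model(ctx)
```

Because inference runs entirely in-process, no network access is needed after the model file is on disk.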

Install

install.packages('edgemodelr')

Monthly Downloads

162

Version

0.1.5

License

MIT + file LICENSE

Maintainer

Pawan Rama Mali

Last Published

January 29th, 2026

Functions in edgemodelr (0.1.5)

edge_set_verbose

Control llama.cpp logging verbosity
edge_quick_setup

Quick setup for a popular model
edge_list_models

List popular pre-configured models
test_ollama_model_compatibility

Test if an Ollama model blob can be used with edgemodelr
edge_load_model

Load a local GGUF model for inference
edgemodelr-package

edgemodelr: Local Large Language Model Inference Engine
is_valid_model

Check if model context is valid
edge_stream_completion

Stream text completion with real-time token generation
edge_small_model_config

Get optimized configuration for small language models
edge_load_ollama_model

Load an Ollama model by partial SHA-256 hash
.robust_download

Robust file download with retry and resume support
build_chat_prompt

Build chat prompt from conversation history
edge_benchmark

Performance benchmarking for model inference
edge_clean_cache

Clean up cache directory and manage storage
edge_chat_stream

Interactive chat session with streaming responses
edge_free_model

Free model context and release memory
edge_completion

Generate text completion using loaded model
edge_find_gguf_models

Find and prepare GGUF models for use with edgemodelr
.download_with_wget

Download using wget command
.download_with_r

Download using R's download.file with libcurl
.download_with_curl

Download using curl command
.is_valid_gguf_file

Check if a file is a valid GGUF file
edge_find_ollama_models

Find and load Ollama models
edge_download_model

Download a GGUF model from Hugging Face
edge_download_url

Download a model from a direct URL
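Combining the setup and streaming helpers from the index above, an interactive session might be sketched as follows. The model identifier passed to `edge_quick_setup`, the shape of its return value, and the streaming callback's signature are all assumptions inferred from the function names, not verified behavior.

```r
library(edgemodelr)

# List the popular pre-configured models bundled with the package
print(edge_list_models())

# Download and load a small model in one step (model id is illustrative)
setup <- edge_quick_setup("TinyLlama-1.1B")
ctx <- setup$context  # assumed field name on the returned list

# Stream tokens as they are generated; the callback interface is assumed
edge_stream_completion(ctx, "Explain GGUF files in one sentence.",
                       callback = function(token) {
                         cat(token)   # print each token as it arrives
                         TRUE         # assumption: TRUE continues generation
                       })

edge_free_model(ctx)
```

For repeated runs, `edge_clean_cache` can reclaim disk space used by downloaded models.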