
edgemodelr (version 0.1.6)

edge_chat_stream: Interactive chat session with streaming responses

Description

Starts an interactive chat session in the R console, streaming the model's responses as they are generated. The conversation history is retained in context, up to `max_history` turns.

Usage

edge_chat_stream(ctx, system_prompt = NULL, max_history = 10, n_predict = 200L,
                 temperature = 0.8, verbose = TRUE)

Value

`NULL`. The function is called for its side effect of running an interactive session, not for its return value.

Arguments

ctx

Model context from edge_load_model()

system_prompt

Optional system prompt used to set the assistant's behavior and context

max_history

Maximum conversation turns to keep in context (default: 10)

n_predict

Maximum tokens per response (default: 200)

temperature

Sampling temperature (default: 0.8)

verbose

Whether to print responses to console (default: TRUE)

Examples

if (FALSE) {
# Requires a downloaded model (not run in checks)
setup <- edge_quick_setup("TinyLlama-1.1B")
ctx <- setup$context

if (!is.null(ctx)) {
  # Start interactive chat with streaming
  edge_chat_stream(ctx,
    system_prompt = "You are a helpful R programming assistant.")

  edge_free_model(ctx)
}
}
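
A further sketch showing how the remaining parameters can be tuned. It mirrors the `edge_quick_setup("TinyLlama-1.1B")` flow from the example above and is wrapped in `if (FALSE)` for the same reason: a locally downloaded model is required, so it cannot run during checks. The specific parameter values are illustrative, not recommendations.

```r
if (FALSE) {
# Assumes a model has been downloaded via edge_quick_setup(),
# as in the example above.
setup <- edge_quick_setup("TinyLlama-1.1B")
ctx <- setup$context

if (!is.null(ctx)) {
  # Tune the chat relative to the defaults (max_history = 10,
  # n_predict = 200, temperature = 0.8):
  edge_chat_stream(ctx,
    system_prompt = "You are a concise R tutor.",
    max_history   = 4,     # keep only the last 4 conversation turns
    n_predict     = 400L,  # allow up to 400 tokens per response
    temperature   = 0.3,   # lower temperature: more deterministic output
    verbose       = TRUE)  # stream responses to the console

  edge_free_model(ctx)     # release the model when finished
}
}
```

Lowering `temperature` makes sampling more deterministic, while a smaller `max_history` reduces the prompt size sent to the model on each turn at the cost of the assistant forgetting earlier exchanges sooner.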
