Decodes tokens back into text.

Usage

decode_tokens(tokens, model)
Value

A character string of the decoded tokens, or a vector of strings.
Arguments

tokens	a vector of tokens to decode, or a list of tokens
model	the model to use for tokenization, either a model name (e.g., gpt-4o) or a tokenizer (e.g., o200k_base). See also the available tokenizers.
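The model argument accepts either a model name or a tokenizer name. A minimal sketch, assuming model_to_tokenizer() (listed under See Also) maps a model name such as "gpt-4o" to its tokenizer name, and that the resulting tokenizer name can be passed to get_tokens() and decode_tokens() in place of the model name:

# Assumed mapping: model name -> tokenizer name
tokenizer <- model_to_tokenizer("gpt-4o")
tokenizer  # expected to be "o200k_base" (assumption)

# Assumption: passing the tokenizer name gives the same round trip
# as passing the model name itself
tokens <- get_tokens("Hello World", tokenizer)
decode_tokens(tokens, tokenizer)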
See Also

model_to_tokenizer(), get_tokens()
Examples

# Tokenize a single string with the gpt-4o model, then decode it back
tokens <- get_tokens("Hello World", "gpt-4o")
tokens
decode_tokens(tokens, "gpt-4o")

# Tokenizing a character vector returns a list of tokens;
# decoding returns a vector of strings
tokens <- get_tokens(c("Hello World", "Alice Bob Charlie"), "gpt-4o")
tokens
decode_tokens(tokens, "gpt-4o")