A vector store automatically chunks, embeds, and indexes files so that
an Assistant with the file_search tool can search over them using
natural language queries.
$files — VectorStoreFilesClient — Add/remove files from a store
$file_batches — VectorStoreFileBatchesClient — Batch-add files
Typical workflow:
1. Upload files: client$files$create(file, purpose = "assistants")
2. Create a vector store: client$vector_stores$create(name = "...")
3. Add files to the store: client$vector_stores$files$create(store_id, file_id)
4. Attach the store to an assistant via tool_resources in client$assistants$create()
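The workflow above can be sketched end to end. This is a minimal, hedged example: it assumes a configured `client` object, a local file "notes.pdf" (hypothetical path), and that response objects expose an `id` field, as the OpenAI API returns.

```r
# Sketch of the typical workflow; file path, model name, and the
# `$id` response fields are assumptions, not part of this page.
file <- client$files$create(file = "notes.pdf", purpose = "assistants")
store <- client$vector_stores$create(name = "project-notes")
client$vector_stores$files$create(store$id, file$id)

# Attach the store so the assistant's file_search tool can query it.
assistant <- client$assistants$create(
  model = "gpt-4o",
  tools = list(list(type = "file_search")),
  tool_resources = list(
    file_search = list(vector_store_ids = list(store$id))
  )
)
```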
new()

VectorStoresClient$new(parent)
create()

VectorStoresClient$create(
name = NULL,
file_ids = NULL,
expires_after = NULL,
chunking_strategy = NULL,
metadata = NULL
)
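A possible call to create(), shown as a sketch: the expires_after and chunking_strategy list shapes follow the OpenAI Vector Stores API and are assumed to be passed through by the client as-is.

```r
# Sketch: a store that expires 7 days after last activity, with an
# explicit static chunking strategy. Field names inside the lists
# come from the OpenAI API, not from this page (assumption).
store <- client$vector_stores$create(
  name = "support-docs",
  expires_after = list(anchor = "last_active_at", days = 7),
  chunking_strategy = list(
    type = "static",
    static = list(max_chunk_size_tokens = 800, chunk_overlap_tokens = 400)
  ),
  metadata = list(team = "support")
)
```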
list()

VectorStoresClient$list(
limit = NULL,
order = NULL,
after = NULL,
before = NULL
)
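Cursor pagination with list() might look like the sketch below; it assumes the response carries a `data` list of store objects with `id` fields, which is the OpenAI list-response shape and not stated on this page.

```r
# Sketch: page through stores 20 at a time, newest first.
page <- client$vector_stores$list(limit = 20, order = "desc")

# Fetch the next page by passing the last id seen as `after`
# (the `data`/`id` response structure is an assumption).
last_id <- page$data[[length(page$data)]]$id
next_page <- client$vector_stores$list(limit = 20, after = last_id)
```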
retrieve()

VectorStoresClient$retrieve(vector_store_id)

update()

VectorStoresClient$update(vector_store_id, ...)

delete()

VectorStoresClient$delete(vector_store_id)
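The single-store methods take a vector store id; a brief sketch (the id "vs_abc123" is a hypothetical placeholder, and passing `name` through update()'s `...` is an assumption):

```r
# Sketch: inspect, rename, then delete a store.
store <- client$vector_stores$retrieve("vs_abc123")
client$vector_stores$update("vs_abc123", name = "renamed-store")
client$vector_stores$delete("vs_abc123")
```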
search()

VectorStoresClient$search(
vector_store_id,
query,
filter = NULL,
max_num_results = NULL,
ranking_options = NULL,
rewrite_query = NULL
)
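A natural-language search over a store could be sketched as follows; the parameter names come from the signature above, while the metadata-filter list shape follows the OpenAI API and is an assumption.

```r
# Sketch: capped result count, query rewriting enabled, and a
# metadata filter (filter shape is an assumption from the OpenAI API).
hits <- client$vector_stores$search(
  "vs_abc123",
  query = "refund policy for annual plans",
  max_num_results = 5,
  rewrite_query = TRUE,
  filter = list(type = "eq", key = "team", value = "support")
)
```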
clone()

The objects of this class are cloneable with this method.

VectorStoresClient$clone(deep = FALSE)

Arguments:
deep — Whether to make a deep clone.
Client for the OpenAI Vector Stores API v2 (Beta).
Vector stores enable semantic file search for Assistants.
Access via client$vector_stores.