AzureStor (version 2.1.1)

list_blobs: Operations on a blob container or blob

Description

Upload, download, or delete a blob; list blobs in a container; copy a blob from an HTTP(S) URL.

Usage

list_blobs(container, info = c("partial", "name", "all"),
  prefix = NULL)

upload_blob(container, src, dest, type = "BlockBlob", blocksize = 2^24,
  lease = NULL, use_azcopy = FALSE)

multiupload_blob(container, src, dest, type = "BlockBlob", blocksize = 2^24,
  lease = NULL, use_azcopy = FALSE, max_concurrent_transfers = 10)

download_blob(container, src, dest, blocksize = 2^24, overwrite = FALSE,
  lease = NULL, use_azcopy = FALSE)

multidownload_blob(container, src, dest, blocksize = 2^24, overwrite = FALSE,
  lease = NULL, use_azcopy = FALSE, max_concurrent_transfers = 10)

delete_blob(container, blob, confirm = TRUE)

copy_url_to_blob(container, src, dest, lease = NULL, async = FALSE)

Arguments

container

A blob container object.

info

For list_blobs, the level of detail to return about each blob: "name" gives a vector of blob names only; "partial" (the default) gives the name, size and last-modified date; "all" returns all available information.

prefix

For list_blobs, filters the result to return only blobs whose name begins with this prefix.

src, dest

The source and destination files for uploading and downloading. See 'Details' below.

type

When uploading, the type of blob to create. Currently only block blobs are supported.

blocksize

The number of bytes to upload/download per HTTP(S) request.

lease

The lease for a blob, if present.

use_azcopy

Whether to use the AzCopy utility from Microsoft to do the transfer, rather than doing it in R.

max_concurrent_transfers

For multiupload_blob and multidownload_blob, the maximum number of concurrent file transfers. Each concurrent file transfer requires a separate R process, so limit this if you are low on memory.

overwrite

When downloading, whether to overwrite an existing destination file.

blob

A string naming a blob.

confirm

Whether to ask for confirmation on deleting a blob.

async

For copy_url_to_blob, whether the copy operation should be asynchronous (proceed in the background).

Value

For list_blobs, details on the blobs in the container. For download_blob, if dest=NULL, the contents of the downloaded blob as a raw vector.

Details

upload_blob and download_blob are the workhorse file transfer functions for blobs. They each take as input a single filename or connection as the source for uploading/downloading, and a single filename as the destination.

multiupload_blob and multidownload_blob are functions for uploading and downloading multiple blobs at once. They parallelise file transfers by deploying a pool of R processes in the background, which can lead to significantly greater efficiency when transferring many small files. They take as input a wildcard pattern as the source, which expands to one or more files. The dest argument should be a directory.
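For example, a minimal sketch (reusing the cont container object from the examples below, with hypothetical local paths) that uploads a directory of CSV files while capping the worker pool via max_concurrent_transfers to limit memory usage:

# cont is a blob container object as created in the examples below
multiupload_blob(cont, src="~/data/*.csv", dest="csv_data",
    max_concurrent_transfers=4)  # at most 4 background R processes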

The file transfer functions also support working with connections to allow transferring R objects without creating temporary files. For uploading, src can be a textConnection or rawConnection object. For downloading, dest can be NULL or a rawConnection object. In the former case, the downloaded data is returned as a raw vector, and for the latter, it will be placed into the connection. See the examples below.

By default, the upload and download functions will display a progress bar to track the file transfer. To turn this off, use options(azure_storage_progress_bar=FALSE). To turn the progress bar back on, use options(azure_storage_progress_bar=TRUE).
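As a small sketch, an unattended script might suppress the progress bar around a transfer and then restore the default (cont and the file path are placeholders):

options(azure_storage_progress_bar=FALSE)   # suppress the progress bar
upload_blob(cont, "~/bigfile.zip", dest="bigfile.zip")
options(azure_storage_progress_bar=TRUE)    # restore the default behaviour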

copy_url_to_blob transfers the contents of the file at the specified HTTP(S) URL directly to blob storage, without requiring a temporary local copy to be made. This has a current file size limit of 256MB.
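As a sketch, the same copy shown in the examples below can be started asynchronously; the call then returns immediately while the copy proceeds in the background:

copy_url_to_blob(cont,
    "https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data",
    "iris.csv", async=TRUE)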

See Also

blob_container, az_storage, storage_download, call_azcopy

AzCopy version 10 on GitHub

Examples

# NOT RUN {
cont <- blob_container("https://mystorage.blob.core.windows.net/mycontainer", key="access_key")

list_blobs(cont)

upload_blob(cont, "~/bigfile.zip", dest="bigfile.zip")
download_blob(cont, "bigfile.zip", dest="~/bigfile_downloaded.zip")

delete_blob(cont, "bigfile.zip")

# uploading/downloading multiple files at once
multiupload_blob(cont, "/data/logfiles/*.zip", "/uploaded_data")
multiupload_blob(cont, "myproj/*")  # no dest directory uploads to root
multidownload_blob(cont, "jan*.*", "/data/january")

# uploading serialized R objects via connections
json <- jsonlite::toJSON(iris, pretty=TRUE, auto_unbox=TRUE)
con <- textConnection(json)
upload_blob(cont, con, "iris.json")

rds <- serialize(iris, NULL)
con <- rawConnection(rds)
upload_blob(cont, con, "iris.rds")

# downloading files into memory: as a raw vector, and via a connection
rawvec <- download_blob(cont, "iris.json", NULL)
rawToChar(rawvec)

con <- rawConnection(raw(0), "r+")
download_blob(cont, "iris.rds", con)
unserialize(con)

# copy from a public URL: Iris data from UCI machine learning repository
copy_url_to_blob(cont,
    "https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data",
    "iris.csv")

# }