gcs_upload

Upload a file of arbitrary type

Upload up to 5TB

Usage
gcs_upload(file, bucket = gcs_get_global_bucket(), type = NULL,
  name = deparse(substitute(file)), object_function = NULL,
  object_metadata = NULL, predefinedAcl = c("private", "authenticatedRead",
  "bucketOwnerFullControl", "bucketOwnerRead", "projectPrivate", "publicRead"),
  upload_type = c("simple", "resumable"))
Arguments
file
A data.frame, list, R object, or filepath (character) of the file to upload
bucket
bucketname you are uploading to
type
MIME type, guessed from file extension if NULL
name
What to call the file once uploaded. Default is the filepath
object_function
If not NULL, a function(input, output) applied to the R object before upload
object_metadata
Optional metadata for object created via gcs_metadata_object
predefinedAcl
Specify user access to object. Default is 'private'
upload_type
Override automatic decision on upload type
Details

When using object_function it expects a function with two arguments:

  • input The object you supply in file to write from
  • output The filename you write to
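A function satisfying this contract can be checked locally before it is passed to gcs_upload. A minimal sketch (the write.csv-based function here is illustrative, not the package's internal default):

```r
## A function matching the object_function contract:
## write `input` to the file path `output`.
f <- function(input, output) {
  write.csv(input, file = output, row.names = FALSE)
}

## Exercise it on a temporary file to confirm the contract holds.
tmp <- tempfile(fileext = ".csv")
f(mtcars, tmp)
file.exists(tmp)  # TRUE
```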

By default the upload_type will be 'simple' if the file is under 5MB and 'resumable' if over 5MB. A multipart upload is used if you supply an object_metadata.

If object_function is NULL and file is not a character filepath, the defaults are:

  • file's class is data.frame: saved via write.csv
  • file's class is list: saved via toJSON

If object_function is not NULL and file is not a character filepath, then object_function will be applied to the R object specified in file before upload. You may also want to set name to ensure the correct file extension is used, e.g. name = 'myobject.feather'
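For example, a sketch using saveRDS as the object_function, with name set so the uploaded object carries a matching .rds extension (the gcs_upload call is commented out, since it requires authentication and a live bucket):

```r
## Serialise any R object with saveRDS before upload.
f <- function(input, output) saveRDS(input, file = output)

## gcs_upload(mtcars, name = "mtcars.rds", object_function = f)

## The function round-trips the object faithfully when run locally:
tmp <- tempfile(fileext = ".rds")
f(mtcars, tmp)
identical(readRDS(tmp), mtcars)  # TRUE
```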

If the file or name argument contains folders, e.g. /data/file.csv, then the file will be uploaded with the same folder structure, e.g. in a /data/ folder. Use name to override this.
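A sketch of the folder behaviour (the path is hypothetical, and the gcs_upload calls are commented out as they require a bucket):

```r
path <- "data/file.csv"  # hypothetical local path containing a folder

## gcs_upload(path)                         # stored as "data/file.csv"
## gcs_upload(path, name = basename(path))  # stored as "file.csv"

basename(path)  # "file.csv"
```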

Value

If successful, a metadata object

scopes

Requires scopes https://www.googleapis.com/auth/devstorage.read_write or https://www.googleapis.com/auth/devstorage.full_control

Aliases
  • gcs_upload
Examples

## Not run: ------------------------------------
# 
# ## set global bucket so you don't need to keep supplying it in future calls
# gcs_global_bucket("my-bucket")
# 
# ## by default will convert dataframes to csv
# gcs_upload(mtcars)
# 
# ## mtcars has been renamed to mtcars.csv
# gcs_list_objects()
# 
# ## to specify the name, use the name argument
# gcs_upload(mtcars, name = "my_mtcars.csv")
# 
# ## when looping, it's best to specify the name, else it will take
# ## the deparsed function call e.g. X[[i]]
# my_files <- list.files("my_uploads")
# lapply(my_files, function(x) gcs_upload(x, name = x))
# 
# ## you can supply your own function to transform R objects before upload
# f <- function(input, output){
#   write.csv2(input, file = output)
# }
# 
# gcs_upload(mtcars, name = "mtcars_csv2.csv", object_function = f)
# 
## ---------------------------------------------



Documentation reproduced from package googleCloudStorageR, version 0.3.0, License: MIT + file LICENSE
