Save an EE ImageCollection to the local system
ee_imagecollection_to_local(
ic,
region,
dsn = NULL,
via = "drive",
container = "rgee_backup",
scale = NULL,
maxPixels = 1e+09,
lazy = FALSE,
public = TRUE,
add_metadata = TRUE,
timePrefix = TRUE,
quiet = FALSE,
...
)
ic: An ee$ImageCollection to be saved locally.
region: EE Geometry (ee$Geometry$Polygon). The CRS needs to be the same as that of the ic argument; otherwise, it will be forced.
dsn: Character. Output filename. If missing, a temporary file is created for each image.
via: Character. Method used to export the image. Two methods are implemented: "drive" and "gcs". See details.
container: Character. Name of the folder ('drive') or bucket ('gcs') to export into (ignored if via is not "drive" or "gcs").
scale: Numeric. The resolution in meters per pixel. Defaults to the native resolution of the image.
maxPixels: Numeric. The maximum allowed number of pixels in the exported image. The task will fail if the exported region covers more pixels in the specified projection. Defaults to 1e+09, as shown in the usage above.
lazy: Logical. If TRUE, a future::sequential object is created to evaluate the task in the future. See details.
public: Logical. If TRUE, a public link to the image is created.
add_metadata: Logical. If TRUE, metadata related to the export is attached to the returned dsn. See details.
timePrefix: Logical. Add the current date and time (Sys.time()) as a prefix to the files to export. This parameter helps to avoid exported files with the same name. By default TRUE.
quiet: Logical. Suppress info messages.
...: Extra exporting arguments. See ee_image_to_drive and ee_image_to_gcs.
If add_metadata is FALSE, a character vector containing the filenames of the downloaded images; otherwise, a list that also includes information related to the export (see details).
ee_imagecollection_to_local supports the download of ee$Images through two different options: "drive" (Google Drive) and "gcs" (Google Cloud Storage). In both cases, ee_imagecollection_to_local works as follows:
1. A task is started (i.e., ee$batch$Task$start()) to move the ee$Image from Earth Engine to the intermediate container specified by the via argument.
2. If the lazy argument is TRUE, the task will not be monitored. This is useful to launch several tasks at the same time and call them later using ee_utils_future_value or future::value. At the end of this step, the ee$Images will be stored at the path specified by the dsn argument.
3. Finally, if the add_metadata argument is TRUE, a list with the following elements is added to the dsn argument:
if via is "drive":
ee_id: Name of the Earth Engine task.
drive_name: Name of the Image in Google Drive.
drive_id: Id of the Image in Google Drive.
drive_download_link: Download link to the image.
if via is "gcs":
ee_id: Name of the Earth Engine task.
gcs_name: Name of the Image in Google Cloud Storage.
gcs_bucket: Name of the bucket.
gcs_fileFormat: Format of the image.
gcs_public_link: Download link to the image.
gcs_URI: gs:// link to the image.
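As an illustrative sketch only (not the package's documented return layout): assuming `ic_files` holds the result of a "drive" export run with `add_metadata = TRUE`, the elements listed above could be inspected as shown below. The indexing mirrors the `sapply(..., '[[', 1)` idiom used in the examples; inspect the actual structure with `str()` since exact names may differ.

```r
# Sketch only: 'ic_files' is assumed to be the value returned by
# ee_imagecollection_to_local(..., via = "drive", add_metadata = TRUE).
str(ic_files)        # inspect the actual structure of the returned object

first <- ic_files[[1]]
first[[1]]           # local filename of the first downloaded image
first$ee_id          # name of the Earth Engine task (illustrative name)
first$drive_name     # name of the image in Google Drive (illustrative name)
```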
For getting more information about exporting data from Earth Engine, take a look at the Google Earth Engine Guide - Export data.
Other image download functions: ee_as_raster(), ee_as_stars(), ee_as_thumbnail()
# NOT RUN {
library(rgee)
library(raster)
ee_Initialize(drive = TRUE, gcs = TRUE)
# USDA example
loc <- ee$Geometry$Point(-99.2222, 46.7816)
collection <- ee$ImageCollection('USDA/NAIP/DOQQ')$
filterBounds(loc)$
filterDate('2008-01-01', '2020-01-01')$
filter(ee$Filter$listContains("system:band_names", "N"))
# From ImageCollection to local directory
ee_crs <- collection$first()$projection()$getInfo()$crs
geometry <- collection$first()$geometry(proj = ee_crs)$bounds()
tmp <- tempdir()
## Using drive
# one by one
ic_drive_files_1 <- ee_imagecollection_to_local(
ic = collection,
region = geometry,
scale = 250,
dsn = file.path(tmp, "drive_")
)
# all at once
ic_drive_files_2 <- ee_imagecollection_to_local(
ic = collection,
region = geometry,
scale = 250,
lazy = TRUE,
dsn = file.path(tmp, "drive_")
)
# From Google Drive to client-side
doqq_dsn <- ic_drive_files_2 %>% ee_utils_future_value()
sapply(doqq_dsn, '[[', 1)
# }
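The examples above export through Google Drive; a parallel sketch for the "gcs" route is shown below. The bucket name "my-ee-bucket" is a placeholder: it must be an existing Google Cloud Storage bucket you can write to, and ee_Initialize(gcs = TRUE) must have been run first.

```r
## Using gcs (sketch; "my-ee-bucket" is a placeholder bucket name)
ic_gcs_files <- ee_imagecollection_to_local(
  ic = collection,
  region = geometry,
  scale = 250,
  via = "gcs",
  container = "my-ee-bucket",
  dsn = file.path(tmp, "gcs_")
)
```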