
sits (version 0.13.0)

sits_cube: Defines a data cube

Description

Creates a data cube based on spatial and temporal restrictions on a collection available in repositories such as AWS, Brazil Data Cube (BDC), and Digital Earth Africa (DEA) using information provided by STAC endpoints. Users can also create data cubes from local files.

A data cube does not contain actual data; it points to the files where the required data is archived. Other functions (e.g. `sits_classify`) use that information to retrieve and process data.
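
A minimal sketch of this behaviour is shown below. It uses the CBERS sample rasters shipped with the package; the call to `sits_bands()` to inspect the cube's metadata is an assumption about the package API and may differ between versions.

library(sits)

# the cube created here is only a description of where the files are
data_dir <- system.file("extdata/raster/cbers", package = "sits")
cbers_local <- sits_cube(
    source    = "LOCAL",
    name      = "cbers_demo",
    satellite = "CBERS-4",
    sensor    = "AWFI",
    data_dir  = data_dir
)

# inspecting the cube reads metadata, not the underlying rasters
sits_bands(cbers_local)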

Currently, users can create data cubes from the following sources:

  • "BDC": Brazil Data Cube (BDC), see also https://brazil-data-cube.github.io/applications/stac.html

  • "WTSS": Web Time Series Service from BDC, see also https://brazil-data-cube.github.io/applications/wtss.html

  • "DEAFRICA": Digital Earth Africa, see also https://www.digitalearthafrica.org/

  • "AWS": Amazon Web Services (AWS), see also https://earth-search.aws.element84.com/v0/

  • "LOCAL": Defines a cube from on a set of local files.

  • "PROBS": Defines a cube to from a set of classified image files.

  • "SATVEG": Defines a cube to use the SATVEG web service, see also https://www.satveg.cnptia.embrapa.br/satveg/login.html

For big data sources such as AWS, BDC, and DEA, users need to provide the following (a minimal call combining these elements is sketched after this list):

  • collection: Collections are the highest level of aggregation in big data repositories. Each repository has its own set of collections, described by STAC. To use STAC for querying repositories, please see the `rstac` package for examples.

  • spatial extent: The spatial extent of the data cube can be defined in two ways: (a) a region of interest (`bbox`) in WGS 84 coordinates; (b) a set of tiles defined according to the collection tiling system.

  • temporal extent: The start and end dates of the cube.
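
A minimal sketch of how these three pieces fit together is shown below; the collection name and coordinates are taken from the Digital Earth Africa example in the Examples section and are illustrative only.

library(sits)

# collection + spatial extent (bbox in WGS 84) + temporal extent
cube <- sits_cube(
    source     = "DEAFRICA",
    collection = "s2_l2a",
    bbox       = c("xmin" = 17.379, "ymin" = 1.157,
                   "xmax" = 17.410, "ymax" = 1.191),
    start_date = "2019-01-01",
    end_date   = "2019-10-28"
)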

Usage

sits_cube(source, ...)

# S3 method for wtss_cube
sits_cube(source = "WTSS", ..., name = "wtss_cube", url = NULL, collection)

# S3 method for bdc_cube
sits_cube(source = "BDC", ..., name = "bdc_cube", collection, bands = NULL,
          tiles = NULL, bbox = NULL, start_date = NULL, end_date = NULL)

# S3 method for deafrica_cube
sits_cube(source = "DEAFRICA", ..., name = "deafrica_cube", url = NULL,
          collection = "s2_l2a", bands = NULL, tiles = NULL, bbox = NULL,
          start_date = NULL, end_date = NULL)

# S3 method for aws_cube
sits_cube(source = "AWS", ..., name = "aws_cube", url = NULL,
          collection = "sentinel-s2-l2a", tiles = NULL, bands = NULL,
          bbox = NULL, s2_resolution = 20, start_date = NULL, end_date = NULL)

# S3 method for usgs_cube
sits_cube(source = "USGS", ..., name = "usgs_cube", url = NULL,
          collection = "landsat-c2l2-sr", tiles = NULL, bands = NULL,
          bbox = NULL, start_date = NULL, end_date = NULL)

# S3 method for local_cube
sits_cube(source = "LOCAL", ..., name = "local_cube", satellite, sensor,
          bands = NULL, start_date = NULL, end_date = NULL, data_dir,
          parse_info = c("X1", "X2", "tile", "band", "date"), delim = "_")

# S3 method for probs_cube
sits_cube(source = "PROBS", ..., name = "probs_cube", satellite, sensor,
          start_date, end_date, probs_labels, probs_files)

# S3 method for satveg_cube
sits_cube(source = "SATVEG", ..., collection = "terra")

Arguments

source

Data source (one of "AWS", "BDC", "DEAFRICA", "LOCAL", "PROBS", "SATVEG", "USGS", "WTSS").

...

Other parameters to be passed to the specific methods.

name

Name of the output data cube.

url

URL for the STAC endpoint of the data source.

collection

Collection to be searched in the data source.

bands

Bands to be included.

tiles

Tiles from the collection to be included in the data cube.

bbox

Area of interest (see details below).

start_date

Initial date for the cube (optional).

end_date

Final date for the cube (optional).

s2_resolution

Resolution of Sentinel-2 (S2) images ("10m", "20m" or "60m") used to build cubes (only for AWS cubes).

satellite

Satellite that produced the images (only for creating data cubes from local files).

sensor

Sensor that produced the images.

data_dir

Directory where local data is located (only for creating data cubes from local files).

parse_info

Parsing information for files without STAC metadata (only for creating data cubes from local files); see the sketch after this list.

delim

Delimiter for parsing files without STAC information (only for creating data cubes from local files).

probs_labels

Labels associated with a probabilities cube.

probs_files

File names (used for creating a cube from probabilities).
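
For local cubes, `parse_info` and `delim` describe how the tile, band, and date are recovered from file names. The sketch below uses base R only to illustrate the splitting; the file name is a hypothetical example of the expected pattern.

# hypothetical file name following the default parse_info and delim
fname  <- "CBERS-4_AWFI_022024_NDVI_2018-08-29.tif"
fields <- strsplit(tools::file_path_sans_ext(fname), split = "_")[[1]]
names(fields) <- c("X1", "X2", "tile", "band", "date")

# fields labelled "X1" and "X2" are discarded; the remaining fields
# yield tile = "022024", band = "NDVI", date = "2018-08-29"
fields[c("tile", "band", "date")]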

Value

The description of a data cube.

Details

The `bbox` parameter allows the selection of an area of interest. The area can be given either as a named vector ("xmin", "ymin", "xmax", "ymax") with values in WGS 84, as an `sfc` or `sf` object from the sf package, or as a GeoJSON geometry (RFC 7946). Note that this parameter does not crop the region; it only selects the images that intersect with it.
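
The snippet below is a minimal sketch of two equivalent ways of defining the area of interest; the coordinates come from the Digital Earth Africa example below, and the sf calls assume the sf package is installed.

# (a) named vector with WGS 84 coordinates
roi_vec <- c("xmin" = 17.379, "ymin" = 1.157,
             "xmax" = 17.410, "ymax" = 1.191)

# (b) the same region as an sfc polygon built with the sf package
roi_sfc <- sf::st_as_sfc(
    sf::st_bbox(c(xmin = 17.379, ymin = 1.157,
                  xmax = 17.410, ymax = 1.191),
                crs = sf::st_crs(4326))
)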

References

`rstac` package (https://github.com/brazil-data-cube/rstac)

Examples

# NOT RUN {

# --- Access to the Brazil Data Cube
# Provide your BDC credentials as environment variables
Sys.setenv(
    "BDC_ACCESS_KEY" = <your_bdc_access_key>
)

# create a data cube based on the information available in the BDC
cbers_tile <- sits_cube(
    source = "BDC",
    collection = "CB4_64_16D_STK-1",
    name = "cbers_022024",
    bands = c("NDVI", "EVI"),
    tiles = "022024",
    start_date = "2018-09-01",
    end_date = "2019-08-28"
)

# --- Create a WTSS cube from BDC cubes
# Provide your BDC credentials as environment variables
Sys.setenv(
    "BDC_ACCESS_KEY" = <your_bdc_access_key>
)

cube_wtss <- sits::sits_cube(source = "WTSS",
                             collection = "MOD13Q1-6")

# --- Access to Digital Earth Africa
# Provide your AWS credentials as environment variables
Sys.setenv(
    "AWS_ACCESS_KEY_ID" = <your_aws_access_key>,
    "AWS_SECRET_ACCESS_KEY" = <your_aws_secret_access_key>
)

# create a data cube based on the information about the files
cube_dea <- sits_cube(source = "DEAFRICA",
                      name = "deafrica_cube",
                      collection = "s2_l2a",
                      bands = c("B04", "B08"),
                      bbox = c("xmin" = 17.379,
                               "ymin" = 1.1573,
                               "xmax" = 17.410,
                               "ymax" = 1.1910),
                      start_date = "2019-01-01",
                      end_date = "2019-10-28")

# --- Access to Sentinel 2/2A level 2 data in AWS
# Provide your AWS credentials as environment variables
Sys.setenv(
    "AWS_ACCESS_KEY_ID" = <your_aws_access_key>,
    "AWS_SECRET_ACCESS_KEY" = <your_aws_secret_access_key>
)

s2_cube <- sits_cube(source = "AWS",
                     name = "T20LKP_2018_2019",
                     collection = "sentinel-s2-l2a",
                     tiles = c("20LKP", "20LLP"),
                     start_date = as.Date("2018-07-18"),
                     end_date = as.Date("2018-07-23"),
                     s2_resolution = 20
)

# --- Create a cube based on a stack of CBERS data
data_dir <- system.file("extdata/raster/cbers", package = "sits")

cbers_cube <- sits_cube(
    source = "LOCAL",
    name = "022024",
    satellite = "CBERS-4",
    sensor = "AWFI",
    data_dir = data_dir,
    delim = "_",
    parse_info = c("X1", "X2", "tile", "band", "date")
)

# Create a raster cube based on files with probability information
# provide the files that make up a raster probs brick with 23 time instances
probs_file <- c(system.file(
    "extdata/raster/probs/sinop-2014_probs_2013_9_2014_8_v1.tif",
    package = "sits"
))

# provide the labels
labels <- c("Cerrado", "Fallow_Cotton", "Forest", "Pasture", "Soy_Corn",
            "Soy_Cotton", "Soy_Fallow", "Soy_Millet", "Soy_Sunflower")

# create a probability cube based on the information about the files
probs_cube <- sits_cube(
    source = "PROBS",
    name = "Sinop-crop-probs",
    satellite = "TERRA",
    sensor  = "MODIS",
    start_date = as.Date("2013-09-14"),
    end_date = as.Date("2014-08-29"),
    probs_labels = labels,
    probs_files = probs_file
)
# }
