gdalcubes (version 0.3.1)

zonal_statistics: Query summary statistics of data cube values over polygons

Description

This function will overlay spatial polygons with a data cube and compute summary statistics of pixel values within the polygons over all time slices.

Usage

zonal_statistics(
  x,
  geom,
  expr,
  out_path = tempfile(fileext = ".gpkg"),
  overwrite = FALSE,
  ogr_layer = NULL,
  as_stars = FALSE
)

Arguments

x

source data cube

geom

Either an sf object, or a path to an OGR dataset (Shapefile, GeoPackage, or similar) with input (multi)polygon geometries

expr

character vector of summary statistics expressions, each pairing an aggregation function with a data cube band (e.g. "mean(band1)"); several expressions can be given at once (see the sketch after this argument list)

out_path

path where the resulting GeoPackage file will be written

overwrite

logical; overwrite out_path if the file already exists; defaults to FALSE

ogr_layer

If the input OGR dataset has multiple layers, a layer can be chosen by name

as_stars

logical; if TRUE, the created gpkg file will be loaded as a stars vector data cube
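
For illustration, several statistics can be requested in a single call by passing a character vector of expressions (a minimal sketch; cube, the band names, and "districts.gpkg" are placeholders, not package data):

zs_path = zonal_statistics(cube, "districts.gpkg",
                           expr = c("mean(B04)", "sd(B04)", "count(B05)"),
                           out_path = file.path(tempdir(), "zonal.gpkg"),
                           overwrite = TRUE)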

Value

length-one character vector containing the path to the resulting GeoPackage file (see Details), or a stars object if as_stars is TRUE

Details

The function creates a single GeoPackage output file containing:

  • A single layer "geom" containing the geometries (and feature identifiers) only.

  • Attribute tables (layers without geometry) for each time slice of the data cube containing summary statistics as columns. Corresponding layer names start with "attr_", followed by date and time.

  • Virtual spatial views for each time slice, joining the geometries and attribute tables. Corresponding layer names start with "map_", followed by date and time.

You will most likely want to use the spatial view layers directly, e.g. with the sf package.

Available summary statistics currently include "min", "max", "mean", "median", "count", "sum", "prod", "var", and "sd".
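
For example, one of the per-time-slice spatial views can be read with the sf package (a minimal sketch, assuming zs_path holds the path returned by zonal_statistics as in the sketch above and that sf is installed; the exact layer names depend on the cube's time slices):

lyrs = sf::st_layers(zs_path)$name                # "geom", "attr_...", and "map_..." layers
map_layer = lyrs[startsWith(lyrs, "map_")][1]     # pick the first time slice
sf::read_sf(zs_path, layer = map_layer)           # geometries joined with the statistics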

Examples

# NOT RUN {
# if not already done in other examples
if (!file.exists(file.path(tempdir(), "L8.db"))) {
  L8_files <- list.files(system.file("L8NY18", package = "gdalcubes"),
                         ".TIF", recursive = TRUE, full.names = TRUE)
  create_image_collection(L8_files, "L8_L1TP", file.path(tempdir(), "L8.db"))
}
L8.col = image_collection(file.path(tempdir(), "L8.db"))
v = cube_view(srs="EPSG:32618", dy=300, dx=300, dt="P1M", 
              aggregation = "median", resampling = "bilinear",
              extent=list(left=388941.2, right=766552.4,
                          bottom=4345299, top=4744931, 
                          t0="2018-01-01", t1="2018-04-30"))
L8.cube = raster_cube(L8.col, v) 
L8.cube = select_bands(L8.cube, c("B04", "B05")) 
L8.ndvi = apply_pixel(L8.cube, "(B05-B04)/(B05+B04)", "NDVI") 
L8.ndvi

# toy example: overlay NDVI data with NYC districts
if (gdalcubes_gdal_has_geos()) {
  x = zonal_statistics(L8.ndvi, system.file("nycd.gpkg", package = "gdalcubes"),
                       expr = "median(NDVI)")
  x
}
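
# Sketch: the statistics can also be returned directly as a stars vector data
# cube by setting as_stars = TRUE (assumes the stars package is installed).
if (gdalcubes_gdal_has_geos() && requireNamespace("stars", quietly = TRUE)) {
  y = zonal_statistics(L8.ndvi, system.file("nycd.gpkg", package = "gdalcubes"),
                       expr = "median(NDVI)", as_stars = TRUE)
  y
}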

# }