Looks for data that have been digitized twice by mistake. For sub-daily data,
this is done by looking for sequences of zero differences between adjacent
observation times; for daily data, by looking for sequences of zero differences
between the same days of adjacent months.
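The daily check described above can be sketched as follows. This is a hypothetical illustration of the zero-difference logic, not the package's internal code; the series values are invented:

```r
# Compare the same calendar days of two adjacent months and flag the
# series if at least ndays consecutive differences are exactly zero.
x_jan <- c(3.1, 4.2, 4.2, 5.0, 6.1, 7.3, 2.8)  # days 1-7, month 1
x_feb <- c(3.1, 4.2, 4.2, 5.0, 6.1, 7.3, 9.9)  # days 1-7, month 2
d <- x_feb - x_jan                    # day-by-day differences
r <- rle(d == 0)                      # run lengths of zero differences
flagged <- any(r$values & r$lengths >= 5)  # TRUE: days 1-6 repeat exactly
```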
Usage
duplicate_columns(Data, meta = NULL, outpath, ndays = 5)
Arguments
Data
A character string giving the path of the input file,
or a matrix with 5 columns for daily data or 7 columns for sub-daily data:
variable code, year, month, day, (hour), (minute), value.
meta
A character vector with 6 elements: station ID, latitude, longitude,
altitude, variable code, units. If Data is a path, meta is
ignored.
outpath
Character string giving the path for the QC results.
ndays
Number of consecutive days with zero difference required to
flag the data. The default is 5.
Author
Yuri Brugnara
Details
The input file must follow the Copernicus Station Exchange
Format (SEF). This function works with any numerical variable.
Zero values are automatically excluded for variables bounded at zero, such as precipitation.
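A minimal sketch of building the in-memory inputs for daily data, assuming the 5-column matrix and 6-element meta vector described under Arguments. Station name, coordinates, and values are illustrative, and the final call requires the package providing duplicate_columns:

```r
# Daily input matrix: variable code, year, month, day, value
# (two months of invented daily temperatures).
Data <- cbind(variable = "ta",
              year     = 1800,
              month    = rep(1:2, each = 31),
              day      = rep(1:31, 2),
              value    = round(rnorm(62, 5, 3), 1))
# Metadata: station ID, latitude, longitude, altitude, variable, units
meta <- c("Bern_1", "46.95", "7.45", "534", "ta", "C")
# duplicate_columns(Data, meta, outpath = "qc_results")  # writes QC flags
```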