Usage:

Renouv(x,
threshold = NULL,
effDuration = NULL,
distname.y = "exponential",
MAX.data = NULL,
MAX.effDuration = NULL,
OTS.data = NULL,
OTS.effDuration = NULL,
OTS.threshold = NULL,
fixed.par.y = NULL,
start.par.y = NULL,
force.start.H = FALSE,
numDeriv = TRUE,
trans.y = NULL,
jitter.KS = TRUE,
pct.conf = c(95, 70),
rl.prob = NULL,
prob.max = 1.0 - 1e-04,
pred.period = NULL,
suspend.warnings = TRUE,
control = list(maxit = 300, fnscale = -1),
control.H = list(maxit = 300, fnscale = -1),
trace = 0,
plot = TRUE,
...)"Rendata".
In the first case, x contains all the levels above the
threshold for a variable of interest.
In the second case, most formal arguments take values in accordy-distribution.y-distribution. Only used when the distribution does not
belong to the list of special distributions.TRUE, the values in start.par.y (which
must then be correct) will be used also as
starting values in the maximisation of the global likelihood : OT data
and historical data. This is useful e.g. when the numDeriv package (value TRUE) or should it be taken
from the results of optim?NULL). This is only possible with the "exponential"
value distname.y. The two allowed choices are "square"
and "log"TRUE, a small amount of
noise is added to the "OT" data used in the Kolmogorov-Smirnov test
in order to remove ties. This is done using the OTjitter function.ret.lev.prob vector: values > prob.max in the default
prob vector are omitted. Ignored when a pred
data.frame.TRUE, the warnings will be suspended
during optimisation steps. This is useful when the parameters
are subject to constraints as is usually the case.optim for the
no-history stage (if any). Note that fnscale = -1 says that
maximisation is required (not minimisation) and must not be changed!optim for the historical stage (if any).0 prints nothing.plot.Renouv, e.g. main,
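For illustration, a minimal sketch of a call combining a few of these arguments follows; the numeric values are only illustrative and are not taken from the package examples.

## Sketch only: request two confidence levels and specific return periods
library(Renext)
data(Garonne)
fitCI <- Renouv(x = Garonne,
                distname.y = "weibull",
                pct.conf = c(95, 70),           ## confidence levels (percent)
                pred.period = c(10, 100, 1000), ## return periods for the prediction table
                plot = FALSE)
summary(fitCI)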
ylim."Renouv". This is mainly a list with the various
results."N" part. This estimate
does not use the historical data, even if is available."y" part. This
estimate does not use the historical data, even if available.ret.lev, but with
"pretty" return periods. These are taken as the provided values
pred.period if any or are chosen as "round" multiples of the
time unit (taken from effDuration). The periods are chosen in order
to cover periods ranging from 1/10 to 10 time units.names(result) to see their
list.
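As a sketch of how the returned object can be inspected (the component name ret.lev is taken from the description above; other component names may differ):

## Sketch: inspect a fitted "Renouv" object
library(Renext)
data(Garonne)
fit <- Renouv(x = Garonne, distname.y = "weibull", plot = FALSE)
names(fit)          ## list the available components
summary(fit)        ## overall summary of the fit
head(fit$ret.lev)   ## return levels (component name assumed from the text above)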
Except in the special case where distname.y is
"exponential" and where no historical data are used, the
inference on quantiles is obtained with the delta method and
using numerical derivatives. Confidence limits are unreliable for
return levels much greater than the observation-historical period.

Due to the presence of estimated parameters, the Kolmogorov-Smirnov
test is unreliable when less than 30 observations are available.

Numerical problems can arise when the data contain large values
(say, thousands or more). A possible solution is then to rescale the data,
e.g. by dividing them by 10 or 100. As a rule of thumb, an
acceptable scaling leads to data (exceedances) of a few units to a few
hundreds, while an order of magnitude of thousands or more should be
avoided and reduced by scaling. Rescaling is recommended for the square
exponential distribution (obtained with trans.y = "square") since
the observations are squared.
Another possible way to solve the problem is to change the
numDeriv value.
This problem will be solved in a future version.
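For instance, a rescaled fit along these lines might look like the following sketch; the scaling factor and threshold are illustrative, based on the Garonne example data.

## Sketch: rescale large flow values before fitting. Return levels from
## 'fitScaled' are then expressed on the rescaled (Flow / 100) scale.
library(Renext)
data(Garonne)
flowScaled <- Garonne$OTdata$Flow / 100
fitScaled <- Renouv(x = flowScaled,
                    threshold = 25,      ## original threshold 2500, divided by 100
                    effDuration = 100,
                    distname.y = "gpd",
                    plot = FALSE)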
The special distributions and their parameter names are:

    exponential   rate
    weibull       shape, scale
    gpd           scale, shape
    log-normal    meanlog, sdlog
    gamma         shape, scale
    mixexp2       prob1, rate1, delta
Other distributions can be used. Because the
probability functions are then used in a "black-box" fashion, these
distributions should respect the following
formal requirements:

- The names of the density, distribution and quantile functions must
be obtained by adding one of the three prefixes "d", "p" or "q" to the
distribution name. This rule applies for the distributions of the
stats package and those of many other packages such as evd.

- The first (vector) argument x, q or p must be
accepted by the density, the distribution and the quantile
functions, respectively.

- The density must have a log formal argument. When log is
TRUE, the log-density is returned instead of
the density.

- The parameters must be given (with their names) in
start.par.y. The arguments list must have exactly the required
number of parameters for the family (e.g. 2 for gamma).

Some parameters can be fixed (known); the parameter set is then the
union of those appearing in start.par.y
and those in fixed.par.y. In the present version, at least one parameter
must be unknown for the y part of the model.

Mathematical requirements exist for a correct use of ML. They
are referred to as "regularity conditions" in ML theory. Note that
the support of the distribution must be the set of non-negative
real numbers.
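A minimal sketch illustrating these requirements is given below. It wraps the Weibull functions of the stats package under the hypothetical name "mydist", so that Renouv handles the distribution through the black-box interface; it assumes the three functions are visible from the workspace, and the starting values are rough guesses for the Garonne exceedances.

## Sketch: a user-defined distribution meeting the formal requirements above
library(Renext)
data(Garonne)

dmydist <- function(x, shape, scale, log = FALSE) {
    ## density: first argument 'x', with a 'log' formal argument
    dweibull(x, shape = shape, scale = scale, log = log)
}
pmydist <- function(q, shape, scale) {
    ## distribution function: first argument 'q'
    pweibull(q, shape = shape, scale = scale)
}
qmydist <- function(p, shape, scale) {
    ## quantile function: first argument 'p'
    qweibull(p, shape = shape, scale = scale)
}

fitCustom <- Renouv(x = Garonne$OTdata$Flow,
                    threshold = 2500,
                    effDuration = 100,
                    distname.y = "mydist",
                    start.par.y = list(shape = 1.5, scale = 1000),
                    plot = FALSE)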
The estimation procedure differs according to the existence of
historical (MAX) data.
In a first stage (or when no historical data are used), the event rate
is estimated from the number of OT events and the duration
effDuration. The "Over the Threshold"
parameters are estimated from the exceedances computed as x.OT
minus the threshold. Note that with distname.y = "gpd", difficulties
can arise when historical data exceed threshold - scale/shape for the
values of shape and scale computed in the first stage, since this value
is the upper end-point of the fitted GPD when the estimated shape is
negative.

See Also:

rRenouv to simulate "renouvellement" data,
RLplot for the return level plot. See
optim for the tuning of the optimisation.

Examples:

library(Renext)
data(Garonne)
## use a "Rendata" object as 'x'. Historical data are used!
fit <- Renouv(x = Garonne, distname = "weibull", trace = 1,
main = "'Garonne' data")
summary(fit)
## generates a warning because of the ties
fit2 <- Renouv(x = Garonne, distname = "gpd",
jitter.KS = FALSE,
threshold = 2800, trace = 1,
main = "'Garonne' data with threshold = 2800 and GPD fit")
## use a numeric vector as 'x'
fit3 <-
Renouv(x = Garonne$OTdata$Flow,
threshold = 2500,
effDuration = 100,
distname = "gpd",
OTS.data = list(numeric(0), c(6800, 7200)),
OTS.effDuration = c(100, 150),
OTS.threshold = c(7000, 6000),
trace = 1,
main = "'Garonne' data with artificial "OTS" data")
## Add historical (fictive) data
fit4 <- Renouv(x = Garonne$OTdata$Flow,
threshold = 2500,
effDuration = 100,
distname = "weibull",
fixed.par.y = list(shape = 1.1),
OTS.data = list(numeric(0), c(6800, 7200)),
OTS.effDuration = c(100, 150),
OTS.threshold = c(7000, 6000),
trace = 0,
main = "'Garonne' data with artificial "OTS" data")Run the code above in your browser using DataLab