Options for penalty setup in the pre-processing
penalty_control(
  defaultSmoothing = NULL,
  df = 10,
  null_space_penalty = FALSE,
  absorb_cons = FALSE,
  anisotropic = TRUE,
  zero_constraint_for_smooths = TRUE,
  no_linear_trend_for_smooths = FALSE,
  hat1 = FALSE,
  sp_scale = function(x) ifelse(is.list(x) | is.data.frame(x), 1/NROW(x[[1]]), 1/NROW(x))
)
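For illustration, the default sp_scale (copied verbatim from the signature above) can be run on its own in base R; it returns 1/n, reading n from the first element when given a list or data frame:

```r
# Default penalty scaling: 1/n, where n is the number of observations.
sp_scale <- function(x) ifelse(is.list(x) | is.data.frame(x), 1/NROW(x[[1]]), 1/NROW(x))

# A data frame is a list, so n is taken from its first column.
sp_scale(data.frame(y = rnorm(100)))  # 0.01
# A plain vector uses NROW(x) directly.
sp_scale(1:50)                        # 0.02
```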
Returns a list with options.

Arguments:
defaultSmoothing: function applied to all s-terms; per default (NULL) the minimum df of all possible terms is used. Must be a function of the smooth term constructed by mgcv's smoothCon and of an argument df.
df: degrees of freedom for all non-linear structural terms (default = 10); either one common value or a list of the same length as the number of distribution parameters; if different df values need to be assigned to different smooth terms, use df as an argument for s(), te() or ti().
null_space_penalty: logical; if TRUE, the null space will also be penalized for smooth effects. Per default, this is equal to the value given in variational.
absorb_cons: logical; if TRUE, identifiability constraints are absorbed into the basis. See ?mgcv::smoothCon for more details.
anisotropic: logical; whether or not to use anisotropic smoothing (default is TRUE).
zero_constraint_for_smooths: logical; the same as absorb_cons, but applied explicitly. If TRUE, a sum-to-zero constraint is put on each smooth. Can be a vector of length(list_of_formulas), with one value per distribution parameter.
no_linear_trend_for_smooths: logical; see zero_constraint_for_smooths, but this additionally removes the linear trend from splines.
hat1: logical; if TRUE, the smoothing parameter is defined via the trace of the hat matrix, sum(diag(H)); otherwise via sum(diag(2*H - HH)), where HH denotes H %*% H.
sp_scale: function of the response; used to scale the penalty (1/n per default).
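To see why hat1 matters: for a penalized smoother the hat matrix H is not idempotent, so the two effective-degrees-of-freedom definitions above disagree. A base-R sketch (a ridge-type smoother, not using the package itself):

```r
set.seed(1)
X <- cbind(1, matrix(rnorm(40), 20, 2))
lambda <- 5
# Penalized hat matrix: H %*% H != H once lambda > 0.
H <- X %*% solve(crossprod(X) + lambda * diag(ncol(X)), t(X))

edf_hat1 <- sum(diag(H))                # definition used when hat1 = TRUE
edf_alt  <- sum(diag(2 * H - H %*% H))  # definition used when hat1 = FALSE
```

Because the eigenvalues of H lie strictly between 0 and 1 here, 2*H - H %*% H has a larger trace, so edf_alt exceeds edf_hat1; for an unpenalized projection (lambda = 0) both reduce to the rank of X.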