A fitted GeDSboost object returned by the function NGeDSboost, inheriting the methods for class "GeDSboost". Methods for the functions coef, knots, plot, print, predict, visualize_boosting and bl_imp are available.
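A minimal usage sketch follows (hedged: the data frame dat, its columns y and x, and the f() base-learner syntax in the formula are placeholders for illustration):

  library(GeDS)
  # fit a functional gradient boosting model with GeD spline base-learners
  fit <- NGeDSboost(y ~ f(x), data = dat)
  print(fit)                           # summary of the boosted fit
  coef(fit)                            # coefficients of the base-learners
  knots(fit)                           # knots of the final fit
  pred <- predict(fit, newdata = dat)  # predicted values
  plot(fit)                            # plot the fit
  bl_imp(fit)                          # base-learner importance
  # visualize_boosting() can additionally be used to inspect individual
  # boosting iterations; see its help page for the available arguments.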
extcall: the call to the NGeDSboost function.

formula: a formula object representing the model to be fitted.
args: a list containing the arguments passed to the NGeDSboost function. This list includes:
  response: a data.frame containing the observations of the response variable.

  predictors: a data.frame containing the observations of the predictor variables included in the model.
  base_learners: a description of the model's base-learners.
  family: the statistical family. The possible options are
    mboost::Binomial(type = c("adaboost", "glm"),
      link = c("logit", "probit", "cloglog", "cauchit", "log"), ...)
    mboost::Gaussian()
    mboost::Poisson()
    mboost::GammaReg(nuirange = c(0, 100))
  Other mboost families may be suitable; however, these have not yet been thoroughly tested and are therefore not recommended for use. (A usage sketch is given after this argument list.)
  initial_learner: if TRUE, an NGeDS or GGeDS fit was used as the initial learner; otherwise, the empirical risk minimizer corresponding to the selected family was employed.
  int.knots_init: if initial_learner = TRUE, the maximum number of internal knots set in the NGeDS/GGeDS function for the initial learner fit.
  shrinkage: the shrinkage/step-length/learning rate used throughout the boosting iterations.
  normalize_data: if TRUE, the response and the predictors were standardized before running the FGB algorithm.
  X_mean: the mean of the predictor variables (only if normalize_data = TRUE; otherwise NULL).

  X_sd: the standard deviation of the predictor variables (only if normalize_data = TRUE; otherwise NULL).

  Y_mean: the mean of the response variable (only if normalize_data = TRUE; otherwise NULL).

  Y_sd: the standard deviation of the response variable (only if normalize_data = TRUE; otherwise NULL).
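For illustration, a hedged sketch of how these arguments are supplied to NGeDSboost (y, x and the data frame dat are placeholders; argument names as documented above):

  # Poisson response with a smaller learning rate
  fit_pois <- NGeDSboost(y ~ f(x), data = dat,
                         family = mboost::Poisson(),
                         shrinkage = 0.1)
  # Gaussian response, standardizing response and predictors first
  fit_norm <- NGeDSboost(y ~ f(x), data = dat,
                         family = mboost::Gaussian(),
                         initial_learner = TRUE, int.knots_init = 2,
                         normalize_data = TRUE)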
models: a list containing the model generated at each boosting iteration. Each of these models includes:

  best_bl: the fit of the base-learner that minimized the residual sum of squares (RSS) in fitting the gradient at the i-th boosting iteration.

  Y_hat: the model's fitted values at the i-th boosting iteration.

  base_learners: the knots and polynomial coefficients of each base-learner at the i-th boosting iteration.
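The per-iteration models can be inspected directly, for instance (a hedged sketch assuming list-style $ access to the fitted object):

  n_iter <- length(fit$models)
  str(fit$models[[n_iter]], max.level = 1)  # best_bl, Y_hat, base_learners
  fit$models[[n_iter]]$best_bl              # base-learner selected at the last iteration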
final_model: a list detailing the final GeDSboost model selected once the gradient descent algorithm has been run:

  model_name: the boosting iteration corresponding to the final model.

  DEV: the deviance of the final model.

  Y_hat: the fitted values of the final model.

  base_learners: a list containing, for each base-learner, the intervals defined by the piecewise linear fit and its corresponding polynomial coefficients. It also includes the knots corresponding to each order fit, which result from computing the corresponding averaging knot location; see Kaishev et al. (2016) for details. If the number of internal knots of the final linear fit is less than $n-1$, the averaging knot location is not computed.

  Linear.Fit/Quadratic.Fit/Cubic.Fit: the final linear, quadratic and cubic fits in B-spline form. These include the same elements as Linear, Quadratic and Cubic in a GeDS-class object (see SplineReg for details).
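The components of final_model can be examined in the same way (again a hedged sketch assuming list-style $ access; component names as documented above):

  fm <- fit$final_model
  fm$model_name                          # boosting iteration used as the final model
  fm$DEV                                 # deviance of the final model
  str(fm$Quadratic.Fit, max.level = 1)   # final quadratic fit in B-spline form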
predictions: a list containing the predicted values obtained from each of the fits (linear, quadratic and cubic).

internal_knots: a list detailing the internal knots obtained for each of the different order fits (linear, quadratic and cubic).
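For example, these two components can be glanced at with str() (hedged; the element names inside each list are not shown here):

  str(fit$predictions, max.level = 1)
  str(fit$internal_knots, max.level = 1)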
Dimitrova, D. S., Kaishev, V. K., Lattuada, A. and Verrall, R. J. (2023). Geometrically designed variable knot splines in generalized (non-)linear models. Applied Mathematics and Computation, 436. DOI: 10.1016/j.amc.2022.127493

Dimitrova, D. S., Guillen, E. S. and Kaishev, V. K. (2024). GeDS: An R Package for Regression, Generalized Additive Models and Functional Gradient Boosting, based on Geometrically Designed (GeD) Splines. Manuscript submitted for publication.