
Laurae (version 0.0.0.9001)

loss_LKL_xgb: Laurae's Kullback-Leibler Error (xgboost function)

Description

This function computes, for use as xgboost's obj function, the gradient and hessian of Laurae's Kullback-Leibler Error loss per value, given preds and dtrain.

Usage

loss_LKL_xgb(preds, dtrain)

Arguments

preds
The predictions.
dtrain
The xgb.DMatrix training data, from which the labels are retrieved.

Value

The gradient and the hessian of the Laurae's Kullback-Leibler Error per value in a list.

Details

This loss function is strictly positive, therefore defined in ]0, +Inf[. It penalizes lower predictions more heavily, and as such is a good fit for problems where undershooting the target must be discouraged. Compared to Laurae's Poisson loss function, Laurae's Kullback-Leibler loss yields much larger loss values. Negative and null values are set to 1e-15. This loss function is experimental.

Loss Formula: (y_true - y_pred) * log(y_true / y_pred)

Gradient Formula: -((y_true - y_pred) / y_pred + log(y_true) - log(y_pred))

Hessian Formula: ((y_true - y_pred) / y_pred + 2) / y_pred
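The formulas above can be sketched in R as plain functions of y_true and y_pred, plus a wrapper following xgboost's custom-objective signature (preds, dtrain). This is an illustrative sketch, not the package's actual implementation; the function names lkl_loss, lkl_grad, lkl_hess and loss_LKL_sketch are chosen here for clarity.

```r
# Laurae's Kullback-Leibler Error: loss, gradient, and hessian
# with respect to y_pred, matching the formulas above.
lkl_loss <- function(y_true, y_pred) (y_true - y_pred) * log(y_true / y_pred)
lkl_grad <- function(y_true, y_pred) -((y_true - y_pred) / y_pred + log(y_true) - log(y_pred))
lkl_hess <- function(y_true, y_pred) ((y_true - y_pred) / y_pred + 2) / y_pred

# Hypothetical wrapper in xgboost's custom-objective shape:
# labels come from dtrain, and negative/null predictions are
# clamped to 1e-15 as described in the Details section.
loss_LKL_sketch <- function(preds, dtrain) {
  y_true <- xgboost::getinfo(dtrain, "label")
  y_pred <- pmax(preds, 1e-15)
  list(grad = lkl_grad(y_true, y_pred), hess = lkl_hess(y_true, y_pred))
}
```

A quick sanity check: differentiating (y_true - y_pred) * log(y_true / y_pred) with respect to y_pred reproduces the gradient formula, and differentiating the gradient reproduces the hessian, which central finite differences confirm numerically.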