lgspline (version 0.2.0)

efficient_bfgs: BFGS Implementation for REML Parameter Estimation

Description

BFGS optimizer designed for REML optimization of correlation parameters. Combines function evaluation and gradient computation into a single call to avoid redundant model refitting.

Usage

efficient_bfgs(par, fn, control = list())

Value

List containing:

par

Parameter vector minimizing objective

value

Minimum objective value

counts

Number of iterations

convergence

TRUE if converged within maxit iterations, FALSE otherwise

message

Description of termination status

vcov

Final approximation of the inverse Hessian, useful for inference
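Because vcov holds the final inverse-Hessian approximation, it can feed Wald-type inference directly. A minimal sketch, assuming lgspline is installed; the quadratic objective below is illustrative, not part of the package:

```r
# Sketch: Wald-type standard errors from the vcov component.
# The quadratic fn is illustrative only; efficient_bfgs is from lgspline.
fn <- function(x) {
  f <- sum((x - c(1, 2))^2)   # objective: quadratic bowl, minimum at (1, 2)
  g <- 2 * (x - c(1, 2))      # exact gradient, same length as x
  list(f, g)
}
if (requireNamespace("lgspline", quietly = TRUE)) {
  res <- lgspline::efficient_bfgs(c(0, 0), fn)
  se  <- sqrt(diag(res$vcov)) # Wald-type standard errors
}
```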

Arguments

par

Numeric vector of initial parameter values.

fn

Function returning list(objective, gradient). Must return both the objective value and a gradient vector whose length matches length(par).

control

List of control parameters:

maxit

Maximum iterations, default 100

abstol

Absolute convergence tolerance, default sqrt(.Machine$double.eps)

reltol

Relative convergence tolerance, default sqrt(.Machine$double.eps)

initial_damp

Initial damping factor, default 1

min_damp

Minimum damping before termination, default 2^-10

trace

Print iteration progress, default FALSE
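For instance, a conforming fn for the two-element par above would return both pieces from one call (illustrative only, not lgspline internals):

```r
# A conforming `fn`: a single call returns objective and gradient together.
fn <- function(x) {
  f <- sum(x^2)   # scalar objective value
  g <- 2 * x      # gradient vector, same length as par
  list(f, g)
}
out <- fn(c(1, -2))
out[[1]]   # objective: 5
out[[2]]   # gradient: c(2, -4), length matches length(par)
```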

Details

Implements BFGS, used internally by lgspline() to optimize correlation parameters via REML when the gradient argument VhalfInv_grad is not NULL.

This is more efficient than the BFGS implementation in stats::optim when the objective and gradient share expensive computations, since both are returned by a single call to fn, avoiding the redundant work of evaluating separate "fn" and "gr" functions.
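The saving can be seen by counting calls to a shared expensive term. A sketch with a hypothetical costly step (standing in for a model refit; not lgspline internals):

```r
# Count how often the expensive shared term is evaluated.
evals <- 0L
shared <- function(x) {       # pretend this is costly (e.g. a model refit)
  evals <<- evals + 1L
  x - c(1, 2)
}

# Combined interface: one expensive call yields objective AND gradient.
fn_combined <- function(x) {
  r <- shared(x)              # computed once
  list(sum(r^2), 2 * r)       # both outputs reuse r
}

# Separate fn/gr (as stats::optim requires): the costly step runs twice.
fn_only <- function(x) sum(shared(x)^2)
gr_only <- function(x) 2 * shared(x)

fn_combined(c(0, 0))          # evals is now 1
fn_only(c(0, 0))
gr_only(c(0, 0))              # evals is now 3: two extra calls per point
```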

Examples

# \donttest{

## Minimize Rosenbrock function
fn <- function(x) {
  # Objective
  f <- 100*(x[2] - x[1]^2)^2 + (1-x[1])^2
  # Gradient
  g <- c(-400*x[1]*(x[2] - x[1]^2) - 2*(1-x[1]),
         200*(x[2] - x[1]^2))
  list(f, g)
}
(res <- efficient_bfgs(c(0.5, 2.5), fn))

## Compare to
(res0 <- stats::optim(c(0.5, 2.5), function(x) fn(x)[[1]], hessian = TRUE))
solve(res0$hessian)
# }