pracma (version 1.9.9)

steep_descent: Steepest Descent Minimization

Description

Function minimization by steepest descent.

Usage

steep_descent(x0, f, g = NULL, info = FALSE, maxiter = 100, tol = .Machine$double.eps^(1/2))

Arguments

x0
start value.
f
function to be minimized.
g
gradient function of f; if NULL, a numerical gradient will be calculated.
info
logical; if TRUE, information is printed at every iteration.
maxiter
max. number of iterations.
tol
relative tolerance, to be used as stopping rule.

Value

List with following components:

xmin
minimum solution found.
fmin
value of f at the minimum.
niter
number of iterations performed.

Details

Steepest descent is a line search method that, at each iteration, moves along the direction of the negative gradient, i.e. the locally steepest downhill direction, with the step length chosen by a line search.
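The iteration can be sketched in plain R: a central-difference numerical gradient (as used when g = NULL) combined with a simple backtracking line search. This is an illustrative sketch, not pracma's actual implementation; the names num_grad and sd_min are made up for this example.

```r
## Central-difference numerical gradient of f at x (illustrative helper)
num_grad <- function(f, x, h = 1e-6) {
    sapply(seq_along(x), function(i) {
        e <- numeric(length(x)); e[i] <- h
        (f(x + e) - f(x - e)) / (2 * h)
    })
}

## Steepest descent with backtracking line search (sketch, not pracma's code)
sd_min <- function(f, x0, tol = 1e-8, maxiter = 100) {
    x <- x0
    for (k in 1:maxiter) {
        g <- num_grad(f, x)
        if (sqrt(sum(g^2)) < tol) break   # gradient (near) zero: stop
        a <- 1
        while (f(x - a * g) >= f(x))      # halve step until f decreases
            a <- a / 2
        x <- x - a * g
    }
    list(xmin = x, fmin = f(x), niter = k)
}

sd_min(function(x) sum(x^2), c(1, 1))
```

On the sphere function the negative gradient points straight at the origin, so the sketch converges in very few iterations; on ill-conditioned problems such as the Rosenbrock function below, steepest descent zig-zags and converges very slowly.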

References

Nocedal, J., and S. J. Wright (2006). Numerical Optimization. Second Edition, Springer-Verlag, New York, pp. 22 ff.

See Also

fletcher_powell

Examples

##  Rosenbrock function: The flat valley of the Rosenbrock function makes
##  it infeasible for a steepest descent approach.
# rosenbrock <- function(x) {
#     n <- length(x)
#     x1 <- x[2:n]
#     x2 <- x[1:(n-1)]
#     sum(100*(x1-x2^2)^2 + (1-x2)^2)
# }
# steep_descent(c(0, 0), rosenbrock)
# Warning message:
# In steep_descent(c(0, 0), rosenbrock) :
#   Maximum number of iterations reached -- not converged.

## Sphere function
sph <- function(x) sum(x^2)
steep_descent(rep(1, 10), sph)
# $xmin
# [1] 0 0 0 0 0 0 0 0 0 0
# $fmin
# [1] 0
# $niter
# [1] 2
