
dfoptim (version 2011.7-2)

hjk: Hooke-Jeeves derivative-free minimization algorithm

Description

An implementation of the Hooke-Jeeves algorithm for derivative-free optimization.
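To illustrate the idea behind the algorithm, here is a minimal base-R sketch (illustrative only, not the package's implementation): exploratory moves along the coordinate axes, followed by a pattern move in the direction of improvement, with the step size halved whenever no move improves the objective.

```r
## Minimal Hooke-Jeeves sketch: hj_sketch, h, tol and maxiter are
## illustrative names, not part of the dfoptim API.
hj_sketch <- function(par, fn, h = 0.5, tol = 1e-6, maxiter = 1000) {
  x <- par; fx <- fn(x); iter <- 0
  while (h > tol && iter < maxiter) {
    iter <- iter + 1
    xe <- x; fe <- fx
    for (i in seq_along(x)) {           # exploratory moves: try +/- h per coordinate
      for (s in c(+1, -1)) {
        xt <- xe; xt[i] <- xt[i] + s * h
        ft <- fn(xt)
        if (ft < fe) { xe <- xt; fe <- ft; break }
      }
    }
    if (fe < fx) {                      # improvement: attempt a pattern move past xe
      xp <- 2 * xe - x
      x <- xe; fx <- fe
      if (fn(xp) < fx) { x <- xp; fx <- fn(xp) }
    } else {
      h <- h / 2                        # no improvement: shrink the step size
    }
  }
  list(par = x, value = fx)
}

hj_sketch(c(-1.2, 1), function(x) sum((x - 3)^2))
```

No derivatives are evaluated anywhere; only function values are compared, which is what makes the method derivative-free.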

Usage

hjk(par, fn, control = list(), ...)

Arguments

par
  Starting vector of parameter values.
fn
  Nonlinear objective function to be minimized; a scalar function taking a real vector as argument.
control
  A list of control parameters; see Details.
...
  Additional arguments passed to fn.

Value

A list with the following components:

par
  Best estimate of the parameter vector found by the algorithm.
value
  Value of the objective function at termination.
convergence
  Indicates convergence (=0) or not (=1).
feval
  Number of times the objective fn was evaluated.
niter
  Number of iterations in the main loop.
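For instance, the returned components can be inspected by name (a sketch assuming the dfoptim package is installed; the quadratic test function is illustrative):

```r
library(dfoptim)

## Minimize a simple quadratic with minimum at rep(1, 4)
res <- hjk(rep(0, 4), function(x) sum((x - 1)^2))

res$par          # best parameter estimate
res$value        # objective value at termination
res$convergence  # 0 indicates convergence
res$feval        # number of function evaluations used
res$niter        # number of main-loop iterations
```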

Details

Argument control is a list specifying changes to default values of algorithm control parameters. Note that parameter names may be abbreviated as long as they are unique. The list items are as follows:

tol
  Convergence tolerance on the step size.
maxfeval
  Maximum number of objective function evaluations.
maximize
  Logical; if TRUE, the objective function is maximized.
target
  A target value of the objective; the iteration stops once this value is reached.
info
  Logical; if TRUE, information about progress is printed during the iterations.

If the minimization process threatens to go into an infinite loop, set either maxfeval or target.
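As a sketch of how the control list is typically passed (assuming the dfoptim package is installed; the test function and the specific values chosen are illustrative):

```r
library(dfoptim)

## Quadratic test function with minimum at rep(2, 5)
quad <- function(x) sum((x - 2)^2)

## Cap the number of function evaluations and tighten the tolerance
hjk(rep(0, 5), quad, control = list(maxfeval = 5000, tol = 1e-8))
```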

References

C.T. Kelley (1999), Iterative Methods for Optimization, SIAM.

Quarteroni, Sacco, and Saleri (2007), Numerical Mathematics, Springer.

See Also

nmk

Examples

##  Hooke-Jeeves solves the high-dimensional Rosenbrock function
rosenbrock <- function(x) {
  n <- length(x)
  sum(100 * (x[1:(n - 1)]^2 - x[2:n])^2 + (x[1:(n - 1)] - 1)^2)
}
par0 <- rep(0, 10)
hjk(par0, rosenbrock)

##  Hooke-Jeeves does not work well on non-smooth functions
nsf <- function(x) {
  f1 <- x[1]^2 + x[2]^2
  f2 <- x[1]^2 + x[2]^2 + 10 * (-4 * x[1] - x[2] + 4)
  f3 <- x[1]^2 + x[2]^2 + 10 * (-x[1] - 2 * x[2] + 6)
  max(f1, f2, f3)
}
par0 <- c(1, 1)    # true minimum 7.2 at (1.2, 2.4)
hjk(par0, nsf)     # hjk finds fmin = 8 at xmin = (2, 2)
