
Rmpfr (version 0.5-3)

hjkMpfr: Hooke-Jeeves Derivative-Free Minimization R (working for MPFR)

Description

An implementation of the Hooke-Jeeves algorithm for derivative-free optimization. This is a slight adaptation of hjk() from package dfoptim, so that it also works with mpfr (arbitrary-precision) numbers.

Usage

hjkMpfr(par, fn, control = list(), ...)

Arguments

par
Starting vector of parameter values. The initial vector may lie on the boundary. If lower[i]=upper[i] for some i, the i-th component of the solution vector will simply be kept fixed.
fn
Nonlinear objective function that is to be optimized. A scalar function that takes a real vector as argument and returns a scalar that is the value of the function at that point.
control
List of control parameters. See Details for more information.
...
Additional arguments passed to fn.
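
A minimal sketch of how these arguments fit together (the objective fq() and its extra argument a are made up for illustration): starting values given as mpfr numbers determine the working precision, and anything passed through ... is forwarded to fn at every evaluation.

library(Rmpfr)
fq <- function(x, a) sum((x - a)^2)          # hypothetical objective; 'a' arrives via '...'
r <- hjkMpfr(mpfr(c(0, 0), precBits = 128), fq, a = c(1, 3))
r$par                                        # 128-bit mpfr result, close to (1, 3)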

Value

A list with the following components:

  • par: Best estimate of the parameter vector found by the algorithm.
  • value: Value of the objective function at termination.
  • convergence: Indicates convergence (TRUE) or not (FALSE).
  • feval: Number of times the objective fn was evaluated.
  • niter: Number of iterations (steps) in the main loop.
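
For instance (a small sketch reusing the simple quadratic from the Examples below; object names are illustrative only):

ff <- function(x) sum((x - 2:4)^2)
r <- hjkMpfr(rep(mpfr(0, 128), 3), ff)
r$convergence        # TRUE if the step-length criterion was met
c(r$feval, r$niter)  # number of function evaluations and of main-loop steps
r$par                # best parameters found, as 128-bit mpfr numbers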

Details

Argument control is a list specifying changes to default values of algorithm control parameters. Note that parameter names may be abbreviated as long as they are unique.

The list items are as follows:

  • tol: Convergence tolerance. The iteration is terminated when the step length of the main loop becomes smaller than tol. This does not imply that the optimum is found with the same accuracy.
  • maxfeval: Maximum number of objective function evaluations allowed; Inf means no restriction at all.
  • maximize: Logical indicating whether the objective function is to be maximized (TRUE) or minimized (FALSE, the default).
  • target: A real number restricting the absolute function value; the procedure stops when this value is exceeded.
  • info: Logical; if TRUE, information about the iterations is printed during the minimization.

If the minimization process threatens to go into an infinite loop, set either maxfeval or target.
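
For example (a sketch: the non-smooth objective and the cap of 5000 evaluations are arbitrary choices), capping maxfeval guarantees termination even when the tolerance is set very tight:

fa <- function(x) sum(abs(x - 1))    # illustrative non-smooth objective
r <- hjkMpfr(mpfr(c(0, 0), 128), fa,
             control = list(tol = 1e-20, maxfeval = 5000, info = FALSE))
r$feval                              # stops once about 5000 evaluations have been used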

References

C.T. Kelley (1999), Iterative Methods for Optimization, SIAM.

Quarteroni, Sacco, and Saleri (2007), Numerical Mathematics, Springer.

See Also

Standard R's optim; optimizeR provides one-dimensional minimization methods that work with mpfr-class numbers.

Examples

## simple smooth example:
ff <- function(x) sum((x - c(2:4))^2)
str(rr <- hjkMpfr(rep(mpfr(0,128), 3), ff, control=list(info=TRUE)))


## Hooke-Jeeves solves high-dim. Rosenbrock function  {but slowly!}
rosenbrock <- function(x) {
    n <- length(x)
    sum (100*((x1 <- x[1:(n-1)])^2 - x[2:n])^2 + (x1 - 1)^2)
}

par0 <- rep(0, 10)
str(rb.db <- hjkMpfr(rep(0, 10), rosenbrock, control=list(info=TRUE)))
## rosenbrock() is quite slow with mpfr-numbers:
str(rb.M. <- hjkMpfr(mpfr(numeric(10), prec=128), rosenbrock,
                     control = list(tol = 1e-8, info=TRUE)))
##  Hooke-Jeeves does not work well on non-smooth functions
nsf <- function(x) {
  f1 <- x[1]^2 + x[2]^2
  f2 <- x[1]^2 + x[2]^2 + 10 * (-4*x[1] - x[2] + 4)
  f3 <- x[1]^2 + x[2]^2 + 10 * (-x[1] - 2*x[2] + 6)
  max(f1, f2, f3)
}
par0 <- c(1, 1) # true min 7.2 at (1.2, 2.4)
h.d <- hjkMpfr(par0,            nsf) # fmin=8 at xmin=(2,2)
## and this is not at all better (but slower!)
h.M <- hjkMpfr(mpfr(c(1,1), 128), nsf, control = list(tol = 1e-15))

## --> demo(hjkMpfr)  # -> Fletcher's chebyquad function  m = n -- residuals
