skedastic (version 2.0.2)

GSS: Golden Section Search for Minimising a Univariate Function over a Closed Interval

Description

Golden Section Search (GSS) is a useful algorithm for minimising a continuous univariate function f(x) over an interval [a, b] in instances where the first derivative f'(x) is not easily obtainable, such as with loss functions that need to be minimised under cross-validation to tune a hyperparameter in a machine learning model. The method is described by Fox (2021).

Usage

GSS(f, a, b, tol = 1e-08, maxitgss = 100L, ...)

Value

A list object containing the following:

  • argmin, the argument of f that minimises f

  • funmin, the minimum value of f achieved at argmin

  • converged, a logical indicating whether the convergence tolerance was satisfied

  • iterations, an integer indicating the number of search iterations used
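
For instance, the components can be accessed by name from the returned list (an illustrative call, not taken from the package documentation):

res <- GSS(function(x) (x - 1) ^ 2, a = 0, b = 2)
res$argmin       # approximate minimiser, close to 1
res$funmin       # minimum value of f, close to 0
res$converged    # TRUE if the tolerance was reached within maxitgss iterations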

Arguments

f

A function of one variable that returns a numeric value

a

A numeric of length 1 representing the lower endpoint of the search interval

b

A numeric of length 1 representing the upper endpoint of the search interval; must be greater than a

tol

A numeric of length 1 representing the tolerance used to determine when the search interval has been narrowed sufficiently for convergence

maxitgss

An integer of length 1 representing the maximum number of iterations to use in the search

...

Additional arguments to pass to f

Details

This function is modelled after a MATLAB function written by Zarnowiec (2022). The method assumes that there is one local minimum in this interval. The solution produced is also an interval, the width of which can be made arbitrarily small by setting a tolerance value. Since the desired solution is a single point, the midpoint of the final interval can be taken as the best approximation to the solution.

The premise of the method is to shorten the interval [a, b] by comparing the values of the function at two test points, x1 < x2, where x1 = b - r(b - a) and x2 = a + r(b - a), and r = (√5 - 1)/2 ≈ 0.618 is the reciprocal of the golden ratio. One compares f(x1) to f(x2) and updates the search interval [a, b] as follows:

  • If f(x1) < f(x2), the solution cannot lie in [x2, b]; thus, update the search interval to [a_new, b_new] = [a, x2]

  • If f(x1) > f(x2), the solution cannot lie in [a, x1]; thus, update the search interval to [a_new, b_new] = [x1, b]

One then chooses two new test points by replacing a and b with a_new and b_new and recalculating x1 and x2 based on these new endpoints. One continues iterating in this fashion until b - a < τ, where τ is the desired tolerance.
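
The procedure can be sketched in a few lines of base R. The code below is only an illustration of the steps described above, not the package's implementation; the name gss_sketch is hypothetical.

gss_sketch <- function(f, a, b, tol = 1e-08, maxit = 100L, ...) {
  r <- (sqrt(5) - 1) / 2                 # reciprocal of the golden ratio, ~0.618
  iter <- 0L
  while (b - a >= tol && iter < maxit) {
    x1 <- b - r * (b - a)                # lower test point
    x2 <- a + r * (b - a)                # upper test point
    if (f(x1, ...) < f(x2, ...)) {
      b <- x2                            # minimum cannot lie in [x2, b]
    } else {
      a <- x1                            # minimum cannot lie in [a, x1]
    }
    iter <- iter + 1L
  }
  xmid <- (a + b) / 2                    # midpoint of the final interval
  list(argmin = xmid, funmin = f(xmid, ...),
       converged = b - a < tol, iterations = iter)
}

gss_sketch(function(x) (x - 1) ^ 2, 0, 2)   # argmin is approximately 1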

References

See Also

goldsect is similar to this function, but does not allow the user to pass additional arguments to f

Examples

f <- function(x) (x - 1) ^ 2
GSS(f, a = 0, b = 2)
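
# A further illustrative example (not taken from the package documentation),
# assuming that named arguments supplied through ... are forwarded to f as
# described under Arguments:
g <- function(x, p) (x - p) ^ 2
res <- GSS(g, a = 0, b = 5, p = 3)
res$argmin       # approximately 3
res$iterations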
