gafit( target, start, thermal=0.1, maxiter=50, samples=10, step=1e-3, tolerance=-Inf )
If the result is not satisfactory, change the .Random.seed variable and try again. The results of this genetic algorithm may be used as a starting point for the nls regression algorithm (which will follow the gradient to the local optimum), so that a "nearly right" fit can be converted into a "best" fit. Chaining the two algorithms often requires that some deliberate error be introduced into the parameters, because nls may complain about a singular gradient matrix when started exactly at the genetic algorithm's optimum. (Thinks... does nls attempt to narrow the step size for the numerical derivative when confronted by a singular gradient matrix? Maybe it should.)
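As a minimal sketch of this chaining (not from the package documentation: the exponential model y ~ a * exp( b * x ), the synthetic data, and the 1% perturbation are all illustrative assumptions), gafit finds rough parameters and a slightly perturbed copy of them seeds nls:

x <- seq( 0, 5, by=0.1 )
y <- 2 * exp( -0.7 * x ) + rnorm( length( x ), sd=0.01 )
e <- expression( sum(( a * exp( b * x ) - y ) ^ 2 ))
rough <- gafit( e, list( a=1, b=-1 ), maxiter=100 )
# Perturb the parameters by 1% so nls does not start exactly at the
# genetic algorithm's optimum, where the gradient matrix may be singular
start <- lapply( rough, function( p ) p * 1.01 )
summary( nls( y ~ a * exp( b * x ), start=start ))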
See also: expression, nls, .Random.seed
# Single parameter, all real numbers (not using least squares)
e <- expression( cos( theta ) + sin( theta ))
guess.1 <- list( theta=3 )
guess.2 <- gafit( e, guess.1, step=1e-3 ) # First attempt with thermal noise
gafit( e, guess.2, step=1e-5, thermal=0 ) # usually gets close to 3.926991
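# ( 3.926991 is 5 * pi / 4, where cos( theta ) + sin( theta ) attains its minimum of -sqrt( 2 ))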
# Double parameter, complex numbers (least-squares curve fit)
sumsq <- function( x ) { sum(( Mod( x )) ^ 2 )}
freq <- exp( 1:15 )
tpj <- 2 * pi * (0+1i)
data <- 1 / ( 10 + tpj * freq * 1e-3 )
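# The synthetic data uses R=10 and C=1e-3, so the fit should converge near these values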
e <- expression( sumsq( 1 / ( R + tpj * freq * C ) - data ))
guess.1 <- list( R=100, C=1e-6 )
guess.2 <- gafit( e, guess.1, step=0.1, maxiter=100, tolerance=1e-2 )
gafit( e, guess.2, thermal=0, step=1e-3, maxiter=200, tolerance=1e-5 )