
pauwels2014 (version 1.0)

BFGS_special: An implementation of BFGS method for posterior maximization.

Description

Gradients are computed using finite differences.
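The finite-difference idea can be illustrated with a small central-difference sketch (the helper name `finite_diff_grad` and the step size `h` are illustrative assumptions, not the package's internals):

```r
# A minimal sketch of a central finite-difference gradient of f at theta.
# Each coordinate is perturbed by +/- h and the slope is estimated.
finite_diff_grad <- function(f, theta, h = 1e-6) {
  sapply(seq_along(theta), function(i) {
    e <- rep(0, length(theta))
    e[i] <- h
    (f(theta + e) - f(theta - e)) / (2 * h)
  })
}

## Gradient of f(x) = sum(x^2) at (1, 2) is approximately (2, 4)
finite_diff_grad(function(x) sum(x^2), c(1, 2))
```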

Usage

BFGS_special(init, knobj, fun_like, verbose = FALSE)

Arguments

init
An initial value of the parameter to be optimized.
knobj
A knowledge list. See knobjs.
fun_like
A function that computes the posterior value. See eval_log_like_knobj.
verbose
Should progress of the local search be printed?

Value

A list with the following entries:
  • theta: The local optimum found by the method.
  • fail: A boolean indicating whether the local search failed due to numerical problems.

Details

The step sizes are chosen using Armijo's rule. Special checks are performed to avoid numerical instabilities in the differential equation solver.
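For intuition, Armijo's rule accepts a step only if it yields sufficient decrease along the search direction, halving the step otherwise. The sketch below is a generic backtracking line search for minimization (the function name and constants are illustrative assumptions; the package's actual step-size logic and stability checks are internal):

```r
# Backtracking line search with the Armijo sufficient-decrease condition:
# accept step alpha if f(theta + alpha*d) <= f(theta) + c1*alpha*grad'd,
# otherwise shrink alpha by factor beta.
armijo_step <- function(f, theta, grad, direction,
                        alpha0 = 1, beta = 0.5, c1 = 1e-4, max_halvings = 30) {
  alpha <- alpha0
  f0 <- f(theta)
  slope <- sum(grad * direction)   # directional derivative, negative for descent
  for (k in seq_len(max_halvings)) {
    if (f(theta + alpha * direction) <= f0 + c1 * alpha * slope) {
      return(alpha)
    }
    alpha <- beta * alpha
  }
  alpha
}

## For f(x) = x^2 at theta = 2, gradient 4, direction -4:
## alpha = 1 overshoots, so the step is halved once
armijo_step(function(x) x^2, 2, 4, -4)
```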

Examples

data(experiment_list1)
data(observables)

## Generate the knowledge object with correct parameter value
knobj <- generate_our_knowledge(transform_params)

## Initialize with some data
knobj$datas[[1]] <- list(
 manip = experiment_list1$nothing,
 data = add_noise(
  simulate_experiment(knobj$global_parameters$true_params_T, knobj, experiment_list1$nothing)[
   knobj$global_parameters$tspan %in% observables[["mrnaLow"]]$reso, 
   observables[["mrnaLow"]]$obs
  ]
 )
)
knobj$experiments <- paste("nothing", "mrnaLow")

theta <- rep( 50, length(knobj$global_parameters$param_names) )
names(theta) <- knobj$global_parameters$param_names

## Only perform 5 iterations
knobj$global_parameters$max_it <- 5

temp <- BFGS_special(theta, knobj, eval_log_like_knobj)
temp$theta
