# constrOptim

##### Linearly Constrained Optimization

Minimise a function subject to linear inequality constraints using an adaptive barrier algorithm.

- Keywords: optimize

##### Usage

```r
constrOptim(theta, f, grad, ui, ci, mu = 1e-04, control = list(),
            method = if(is.null(grad)) "Nelder-Mead" else "BFGS",
            outer.iterations = 100, outer.eps = 1e-05, ...,
            hessian = FALSE)
```

##### Arguments

- `theta`: numeric (vector) starting value (of length $p$): must be in the feasible region.
- `f`: function to minimise (see below).
- `grad`: gradient of `f` (a `function` as well), or `NULL` (see below).
- `ui`: constraint matrix ($k \times p$), see below.
- `ci`: constraint vector of length $k$ (see below).
- `mu`: (small) tuning parameter.
- `control`, `method`, `hessian`: passed to `optim`.
- `outer.iterations`: iterations of the barrier algorithm.
- `outer.eps`: non-negative number; the relative convergence tolerance of the barrier algorithm.
- `...`: other named arguments to be passed to `f` and `grad`; these need to be passed through `optim`, so they should not match its argument names.

##### Details

The feasible region is defined by `ui %*% theta - ci >= 0`. The
starting value must be in the interior of the feasible region, but the
minimum may be on the boundary.
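
For example, the hypothetical constraints $x_1 \ge 0$, $x_2 \ge 0$ and $x_1 + x_2 \le 1$ (an illustrative choice, not from this help page) can be encoded as follows:

```r
## Encode x1 >= 0, x2 >= 0 and x1 + x2 <= 1 in the form ui %*% theta - ci >= 0.
ui <- rbind(c( 1,  0),   # x1 >= 0
            c( 0,  1),   # x2 >= 0
            c(-1, -1))   # -x1 - x2 >= -1, i.e. x1 + x2 <= 1
ci <- c(0, 0, -1)

theta <- c(0.25, 0.25)        # a strictly feasible (interior) starting value
all(ui %*% theta - ci > 0)    # strict feasibility check: TRUE
```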

A logarithmic barrier is added to enforce the constraints and then
`optim` is called. The barrier function is chosen so that
the objective function should decrease at each outer iteration. Minima
in the interior of the feasible region are typically found quite
quickly, but a substantial number of outer iterations may be needed
for a minimum on the boundary.
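
The outer loop can be sketched roughly as below. This is an illustrative simplification, not the stats implementation: the function name, the exact barrier expression and the stopping rule are assumptions.

```r
## Sketch of an adaptive log-barrier outer loop (assumed form, for illustration).
## theta must be strictly feasible: ui %*% theta - ci > 0.
barrier_sketch <- function(theta, f, ui, ci, mu = 1e-4,
                           outer.iterations = 100, outer.eps = 1e-5) {
  obj.old <- f(theta)
  for (i in seq_len(outer.iterations)) {
    theta.old <- theta
    ## Augmented objective: f plus a log barrier anchored at theta.old.
    R <- function(th) {
      gi <- drop(ui %*% th - ci)
      if (any(gi <= 0)) return(Inf)            # reject infeasible points
      gi.old <- drop(ui %*% theta.old - ci)
      f(th) - mu * sum(gi.old * log(gi) - ui %*% th)
    }
    theta <- optim(theta.old, R)$par           # inner minimisation
    obj <- f(theta)
    ## Assumed relative-change stopping rule for the outer iteration.
    if (abs(obj - obj.old) < outer.eps * (abs(obj.old) + outer.eps)) break
    obj.old <- obj
  }
  list(par = theta, value = obj)
}

## Usage: minimise sum((x - 2)^2) subject to x1 <= 1 and x2 <= 1;
## the constrained minimum lies on the boundary near c(1, 1).
res <- barrier_sketch(c(0, 0), function(x) sum((x - 2)^2),
                      ui = rbind(c(-1, 0), c(0, -1)), ci = c(-1, -1))
```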

The tuning parameter `mu` multiplies the barrier term. Its precise
value is often relatively unimportant. As `mu` decreases, the
augmented objective function becomes closer to the original objective
function but also less smooth near the boundary of the feasible
region.

Any `optim` method that permits infinite values for the
objective function may be used (currently all but `"L-BFGS-B"`).

The objective function `f` takes as first argument the vector
of parameters over which minimisation is to take place. It should
return a scalar result. Optional arguments `...` will be
passed to `optim` and then (if not used by `optim`) to
`f`. As with `optim`, the default is to minimise, but
maximisation can be performed by setting `control$fnscale` to a
negative value.
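
For instance (an illustrative example, not from this help page), a concave function can be maximised over the positive quadrant by setting `control$fnscale` to `-1`:

```r
## Maximise a concave function subject to x1 >= 0 and x2 >= 0.
f <- function(x) -(x[1] - 0.5)^2 - (x[2] - 0.5)^2   # peak at c(0.5, 0.5)

res <- constrOptim(c(0.2, 0.2), f, grad = NULL,      # NULL grad: Nelder-Mead
                   ui = rbind(c(1, 0), c(0, 1)), ci = c(0, 0),
                   control = list(fnscale = -1))     # negative fnscale: maximise
res$par   # approximately c(0.5, 0.5), an interior maximum
```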

The gradient function `grad` must be supplied except with
`method = "Nelder-Mead"`. It should take arguments matching
those of `f` and return a vector containing the gradient.
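
Putting the pieces together: minimise the Rosenbrock function subject to $x_1 \le 0.9$ (the particular constraint is an illustrative choice), so that the unconstrained minimum at `c(1, 1)` is infeasible and the solution lies on the boundary:

```r
fr <- function(x) {   # Rosenbrock Banana function
  100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
}
grr <- function(x) {  # its gradient
  c(-400 * x[1] * (x[2] - x[1]^2) - 2 * (1 - x[1]),
     200 * (x[2] - x[1]^2))
}

## One constraint, x1 <= 0.9, written as -x1 >= -0.9.
res <- constrOptim(c(0.5, 0), fr, grr,
                   ui = rbind(c(-1, 0)), ci = -0.9)
res$par   # near c(0.9, 0.81), on the constraint boundary
```

Since `grad` is supplied, the inner optimiser defaults to `"BFGS"`.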

##### Value

As for `optim`, but with two extra components: `barrier.value`, giving the value of the barrier function at the optimum, and `outer.iterations`, giving the number of outer iterations (calls to `optim`).

##### References

K. Lange (2001). *Numerical Analysis for Statisticians.* Springer, p. 185ff.

##### See Also

`optim`, especially `method = "L-BFGS-B"`, which
does box-constrained optimisation.

*Documentation reproduced from package stats, version 3.2.5, License: Part of R 3.2.5*