mgcv (version 1.3-22)

pcls: Penalized Constrained Least Squares Fitting

Description

Solves least squares problems with quadratic penalties subject to linear equality and inequality constraints using quadratic programming.

Usage

pcls(M)

Arguments

M
is the single list argument to pcls. It should have the following elements (a minimal sketch of the layout follows this list):
  • y: The response data vector.
  • w: A vector of weights for the data (often proportional to the reciprocal of the variance).
  • X: The design matrix for the problem; ncol(M$X) must give the number of model parameters, while nrow(M$X) should give the number of data points.
  • C: Matrix containing any linear equality constraints on the problem (the C in Cp = c). If there are no equality constraints, initialize this to a zero by zero matrix.
  • S: A list of the penalty coefficient matrices; S[[i]] defines the ith penalty.
  • off: Offset values locating each element of M$S within the full penalty coefficient matrix (a zero offset implies the penalty starts at the first parameter).
  • sp: An array of smoothing parameter estimates, one per penalty.
  • p: An array of feasible initial parameter estimates: these must satisfy the constraints, but should avoid satisfying any inequality constraint as an equality.
  • Ain: Matrix of coefficients defining the inequality constraints (the A_in in A_in p > b_in).
  • bin: Vector giving the right hand side of the inequality constraints (b_in).
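
For illustration, here is a minimal sketch of the list layout (it mirrors the first example under Examples below; the single inequality constraint forces the intercept to be non-negative):

library(mgcv)
n <- 10; x <- runif(n); y <- x - 0.2 + rnorm(n)*0.1
M <- list(y   = y,
          w   = rep(1, n),             # unit weights
          X   = cbind(1, x),           # design matrix: intercept and slope
          C   = matrix(0, 0, 0),       # no equality constraints
          S   = list(),                # no penalties ...
          off = array(0, 0),           # ... so no penalty offsets ...
          sp  = array(0, 0),           # ... and no smoothing parameters
          p   = c(0.1, 0.5),           # feasible starting values (0.1 > 0)
          Ain = matrix(c(1, 0), 1, 2), # one inequality: 1*p[1] + 0*p[2] > 0
          bin = 0)
pcls(M)                                # returns the constrained estimate of p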

Value

An array containing the estimated parameter vector.

Details

This solves the problem: $$\mathrm{minimise}~ \| {\bf W}^{1/2} ({\bf Xp} - {\bf y}) \|^2 + \sum_{i=1}^m \lambda_i {\bf p}^\prime {\bf S}_i {\bf p}$$ subject to constraints ${\bf Cp}={\bf c}$ and ${\bf A}_{in}{\bf p}>{\bf b}_{in}$, w.r.t. $\bf p$, given the smoothing parameters $\lambda_i$. ${\bf X}$ is a design matrix, $\bf p$ a parameter vector, $\bf y$ a data vector, $\bf W$ a diagonal weight matrix, ${\bf S}_i$ a positive semi-definite matrix of coefficients defining the ith penalty and $\bf C$ a matrix of coefficients defining the linear equality constraints on the problem. The smoothing parameters are the $\lambda_i$. Note that ${\bf X}$ must be of full column rank, at least when projected into the null space of any equality constraints. ${\bf A}_{in}$ is a matrix of coefficients defining the inequality constraints, while ${\bf b}_{in}$ is the vector on their right hand side.
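
In R terms, the list M carries exactly these quantities: M$X is ${\bf X}$, M$w holds the diagonal of ${\bf W}$, M$S and M$sp hold the ${\bf S}_i$ and $\lambda_i$, and so on. As a purely illustrative sketch (pcls_objective is a hypothetical helper, not part of mgcv), the objective being minimised can be written as:

pcls_objective <- function(M, p) {
  r <- sqrt(M$w) * (M$X %*% p - M$y)        # W^{1/2} (Xp - y)
  pen <- 0
  for (i in seq_along(M$S)) {               # sum_i lambda_i p' S_i p
    ind <- M$off[i] + 1:ncol(M$S[[i]])      # parameters penalized by S[[i]]
    pen <- pen + M$sp[i] * (t(p[ind]) %*% M$S[[i]] %*% p[ind])
  }
  sum(r^2) + as.numeric(pen)
}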

Quadratic programming is used to perform the solution. The method used is designed for maximum stability with least squares problems: i.e. ${\bf X}^\prime {\bf X}$ is not formed explicitly. See Gill et al. 1981.
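
The stability point is easy to demonstrate: forming ${\bf X}^\prime {\bf X}$ squares the condition number of the problem, so accuracy can be lost before the solver even starts. A small illustration (not from the original documentation):

set.seed(1)
X <- matrix(rnorm(100*10), 100, 10)
X[,10] <- X[,1] + 1e-5*rnorm(100)  # make two columns nearly collinear
kappa(X)                           # condition number of X
kappa(crossprod(X))                # roughly kappa(X)^2: forming X'X loses precision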

References

Gill, P.E., Murray, W. and Wright, M.H. (1981) Practical Optimization. Academic Press, London.

Wood, S.N. (1994) Monotonic smoothing splines fitted by cross validation. SIAM Journal on Scientific Computing 15(5):1126-1133.

http://www.maths.bath.ac.uk/~sw283/

See Also

mgcv, mono.con

Examples

library(mgcv)
# first an un-penalized example - fit E(y)=a+bx subject to a>0
set.seed(0)
n<-100
x<-runif(n);y<-x-0.2+rnorm(n)*0.1
M<-list(X=matrix(0,n,2),p=c(0.1,0.5),off=array(0,0),S=list(),
Ain=matrix(0,1,2),bin=0,C=matrix(0,0,0),sp=array(0,0),y=y,w=y*0+1)
M$X[,1]<-1;M$X[,2]<-x;M$Ain[1,]<-c(1,0)
pcls(M)->M$p
plot(x,y);abline(M$p,col=2);abline(coef(lm(y~x)),col=3)
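
# Quick sanity check (not in the original example): the constrained intercept
# respects a > 0, while the unconstrained lm intercept is typically negative here.
M$p[1] >= 0       # TRUE
coef(lm(y~x))[1]  # compare: unconstrained estimate of the intercept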

# Penalized example: monotonic penalized regression spline .....

# Generate data from a monotonic truth.
x<-runif(100)*4-1;x<-sort(x);
f<-exp(4*x)/(1+exp(4*x));y<-f+rnorm(100)*0.1;plot(x,y)
dat<-data.frame(x=x,y=y)
# Show regular spline fit (and save fitted object)
f.ug<-gam(y~s(x,k=10,bs="cr"));lines(x,fitted(f.ug))
# Create Design matrix, constraints etc. for monotonic spline....
sm<-smoothCon(s(x,k=10,bs="cr"),dat,knots=NULL)[[1]] # smoothCon returns a list of smooths
F<-mono.con(sm$xp);   # get constraints
G<-list(X=sm$X,C=matrix(0,0,0),sp=f.ug$sp,p=sm$xp,y=y,w=y*0+1)
G$Ain<-F$A;G$bin<-F$b;G$S<-sm$S;G$off<-0

p<-pcls(G);  # fit spline (using s.p. from unconstrained fit)

fv<-Predict.matrix(sm,data.frame(x=x))%*%p
lines(x,fv,col=2)
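
# Illustrative check (not in the original example): the monotonicity
# constraints imply non-decreasing fitted values over the sorted x.
all(diff(fv) >= 0)  # expected TRUE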

# now a tprs example of the same thing....

f.ug<-gam(y~s(x,k=10));lines(x,fitted(f.ug))
# Create Design matrix, constraints etc. for monotonic spline....
sm<-smoothCon(s(x,k=10,bs="tp"),dat,knots=NULL)[[1]]
xc<-0:39/39 # points on [0,1]  
nc<-length(xc)  # number of constraints
xc<-xc*4-1  # points at which to impose constraints
A0<-Predict.matrix(sm,data.frame(x=xc)) 
# ... A0%*%p evaluates spline at xc points
A1<-Predict.matrix(sm,data.frame(x=xc+1e-6)) 
A<-(A1-A0)/1e-6    
# ... approx. constraint matrix (A%*%p approximates the spline gradient at the
# points xc, so A%*%p > 0 forces an increasing fit)
G<-list(X=sm$X,C=matrix(0,0,0),sp=f.ug$sp,y=y,w=y*0+1,S=sm$S,off=0)
G$Ain<-A;    # constraint matrix
G$bin<-rep(0,nc);  # constraint vector
G$p<-rep(0,10);G$p[10]<-0.1  
# ... feasible monotonic starting parameters, obtained by setting a coefficient
# of the polynomial part of the basis
p<-pcls(G);  # fit spline (using s.p. from unconstrained fit)

fv2<-Predict.matrix(sm,data.frame(x=x))%*%p
lines(x,fv2,col=3)
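
# Illustrative check: monotonicity was only imposed at the grid xc, so the
# fitted values are expected (though not strictly guaranteed) to be
# non-decreasing at the data points as well.
all(diff(fv2) >= 0)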
