gplm (version 0.7-4)

kgplm: Generalized partial linear model

Description

Fits a generalized partial linear model E(y|x,t) = G(x'b + m(t)) by kernel methods, i.e. a model that combines a linear part x'b and a nonparametric function m(t) in one additive predictor, using the (generalized) Speckman estimator or backfitting (in the generalized case combined with local scoring).
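
As a rough illustration (not taken from the package's own examples), the two estimators are selected via the method argument; the simulated data and the bandwidth below are arbitrary, and only the returned component m (also used in the Examples) is relied on:

  library(gplm)
  set.seed(1)
  n <- 200
  x <- matrix(rnorm(n), n, 1)            # linear part (p = 1)
  t <- runif(n, min = -1, max = 1)       # nonparametric part (q = 1)
  y <- x[, 1] + sin(pi * t) + rnorm(n)
  fit.sp <- kgplm(x, t, y, h = 0.3, family = "gaussian", link = "identity",
                  method = "speckman")
  fit.bf <- kgplm(x, t, y, h = 0.3, family = "gaussian", link = "identity",
                  method = "backfit")
  head(cbind(fit.sp$m, fit.bf$m))        # compare the two nonparametric fits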

Usage

kgplm(x, t, y, h, family, link, b.start = NULL, m.start = NULL,
      grid = NULL, offset = 0, method = "speckman", sort = TRUE,
      weights = 1, weights.trim = 1, weights.conv = 1,
      max.iter = 25, eps.conv = 1e-8, kernel = "biweight",
      kernel.product = TRUE, verbose = FALSE)

Arguments

x
n x p matrix, data for linear part
y
n x 1 vector, responses
t
n x q matrix, data for nonparametric part
h
scalar or 1 x q vector, bandwidth(s); for q > 1 one bandwidth per column of t may be given (see the sketch after this list)
family
text string, family of distributions (e.g. "gaussian" or "bernoulli", see details for glm.ll)
link
text string, link function (depending on family, see details for glm.ll)
b.start
p x 1 vector, start values for linear part
m.start
n x 1 vector, start values for nonparametric part
grid
m x q matrix, where to calculate the nonparametric function (default = t)
offset
optional offset, added to the linear predictor
method
"speckman" or "backfit"
sort
logical, TRUE if data need to be sorted
weights
binomial weights
weights.trim
trimming weights for fitting the linear part
weights.conv
weights for convergence criterion
max.iter
maximal number of iterations
eps.conv
convergence criterion
kernel
text string, see kernel.function
kernel.product
(if q>1) product or spherical kernel (see the sketch at the end of the Examples)
verbose
print additional convergence information
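
For a multivariate nonparametric part, h accepts one bandwidth per column of t, as noted above. A minimal sketch, with arbitrary simulated data and bandwidths:

  library(gplm)
  set.seed(1)
  n <- 500
  x <- matrix(rnorm(n), n, 1)
  t <- cbind(runif(n, -1, 1), runif(n, -1, 1))        # q = 2
  y <- x[, 1] + sin(pi * t[, 1]) + t[, 2] + rnorm(n)
  fit <- kgplm(x, t, y, h = c(0.4, 0.6),              # one bandwidth per column of t
               family = "gaussian", link = "identity")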

Value

List with components, among them m, the estimate of the nonparametric function at the observed t, and m.grid, the corresponding estimate on grid (see the examples below).
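
A minimal sketch of inspecting the returned list (only the components m and m.grid are documented above, so names() is used to list the rest); data and bandwidth are arbitrary:

  library(gplm)
  set.seed(1)
  n <- 200
  x <- matrix(runif(n, -1, 1), n, 1)
  t <- runif(n, -1, 1)
  y <- x[, 1] + sin(pi * t) + rnorm(n)
  gh <- kgplm(x, t, y, h = 0.3, family = "gaussian", link = "identity")
  names(gh)      # component names of the returned list
  head(gh$m)     # nonparametric function estimate at the observed t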

References

Mueller, M. (2001). Estimation and testing in generalized partial linear models -- A comparative study. Statistics and Computing, 11:299--309.

Hastie, T. and Tibshirani, R. (1990). Generalized Additive Models. London: Chapman and Hall.

See Also

kernel.function, kreg

Examples

  ## data
  n <- 1000; b <- c(1,-1); rho <- 0.7
  m <- function(t){ 1.5*sin(pi*t) }
  x1 <- runif(n,min=-1,max=1); u  <- runif(n,min=-1,max=1)
  t  <- runif(n,min=-1,max=1); x2 <- round(m(rho*t + (1-rho)*u))
  x  <- cbind(x1,x2)
  y  <- x %*% b + m(t) + rnorm(n)

  ## partial linear model (PLM)
  gh <- kgplm(x,t,y,h=0.25,family="gaussian",link="identity")
  o <- order(t)
  plot(t[o],m(t[o]),type="l",col="green")
  lines(t[o],gh$m[o]); rug(t)

  ## partial linear probit model (GPLM)
  y <- (y>0)
  gh <- kgplm(x,t,y,h=0.25,family="bernoulli",link="probit")

  o <- order(t)
  plot(t[o],m(t[o]),type="l",col="green")
  lines(t[o],gh$m[o]); rug(t)

  ## data with two-dimensional m-function 
  n <- 1000; b <- c(1,-1); rho <- 0.7
  m <- function(t1,t2){ 1.5*sin(pi*t1)+t2 }
  x1 <- runif(n,min=-1,max=1); u  <- runif(n,min=-1,max=1)
  t1 <- runif(n,min=-1,max=1); t2 <- runif(n,min=-1,max=1)
  x2 <- round( m( rho*t1 + (1-rho)*u , t2 ) )
  x  <- cbind(x1,x2); t  <- cbind(t1,t2)
  y  <- x %*% b + m(t1,t2) + rnorm(n)

  ## partial linear model (PLM)
  grid1 <- seq(min(t[,1]),max(t[,1]),length=20)
  grid2 <- seq(min(t[,2]),max(t[,2]),length=25)
  grid  <- create.grid(list(grid1,grid2))

  gh <- kgplm(x,t,y,h=0.5,grid=grid,family="gaussian",link="identity")

  o <- order(grid[,2],grid[,1])
  est.m  <- (matrix(gh$m.grid[o],length(grid1),length(grid2)))
  orig.m <- outer(grid1,grid2,m)
  par(mfrow=c(1,2))
  persp(grid1,grid2,orig.m,main="Original Function",
        theta=30,phi=30,expand=0.5,col="lightblue",shade=0.5)
  persp(grid1,grid2,est.m,main="Estimated Function",
        theta=30,phi=30,expand=0.5,col="lightblue",shade=0.5)
  par(mfrow=c(1,1))
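
  ## (added sketch, not from the original examples) spherical instead of
  ## product kernel for the two-dimensional t, via `kernel.product = FALSE`
  gh.sph <- kgplm(x, t, y, h = 0.5, grid = grid,
                  family = "gaussian", link = "identity",
                  kernel.product = FALSE)
  persp(grid1, grid2, matrix(gh.sph$m.grid[o], length(grid1), length(grid2)),
        main = "Estimated Function (spherical kernel)",
        theta = 30, phi = 30, expand = 0.5, col = "lightblue", shade = 0.5)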
