
PredictiveRegression (version 0.1-4)

iidpred: IID predictor

Description

Prediction intervals based on the IID model

Usage

iidpred(train,test,epsilons=c(0.05,0.01),ridge=0)

Arguments

train
Training set as a matrix of size $N$ times $K+1$. Each row describes an observation. Columns 1 to $K$ are the explanatory variables, and column $K+1$ is the response variable.
test
Test set as a matrix of size $N2$ times $K$. Each row corresponds to an observation (but without the response variable). Columns 1 to $K$ are the explanatory variables.
epsilons
Vector of significance levels. Each significance level epsilons[$j$] is a number between 0 and 1. The default value is c(0.05, 0.01), i.e., 5% and 1%.
ridge
Ridge coefficient, a nonnegative number. The default value is 0; setting it to a small positive constant might lead to more stable results.
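
For example, train and test matrices in this layout might be assembled with cbind. The following is a minimal sketch on synthetic data; the variable names and the data-generating step are purely illustrative:

  K <- 2                                               # number of explanatory variables
  X <- matrix(rnorm(200 * K), nrow = 200, ncol = K)    # explanatory variables of the training set
  y <- X %*% c(1, -1) + rnorm(200)                     # synthetic responses
  train <- cbind(X, y)                                 # N x (K+1): columns 1..K explanatory, column K+1 the response
  test <- matrix(rnorm(5 * K), nrow = 5, ncol = K)     # N2 x K: explanatory variables only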

Value

output[[1]]
The matrix of lower bounds of prediction intervals. Its size is $N2$ times $Neps$, where $N2$ is the number of test observations and $Neps$ is the number of significance levels. The element output[[1]][$i$,$j$] of output[[1]] is the lower bound $a$ of the prediction interval $[a,b]$ for the $i$-th test observation and for the $j$-th significance level epsilons[$j$] in the vector epsilons.
output[[2]]
The matrix of upper bounds $b$, with the same structure as output[[1]]. Typically $a = $output[[1]][$i$,$j$] and $b = $output[[2]][$i$,$j$] are real numbers such that $a \le b$. Exceptions: $a$ is allowed to be $-\infty$ and $b$ to be $\infty$; the only case where $a > b$ is $a = \infty$ and $b = -\infty$ (the empty prediction $[a,b]$).
output[[3]]
The termination code: 0 = normal termination; 1 = illegal parameters (the training and test sets have different numbers of explanatory variables); 2 = too few observations for all significance levels.
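
The sketch below (continuing the synthetic train and test constructed above) illustrates one way these components might be used: check the termination code first, then collect the bounds for the first significance level into a table:

  library(PredictiveRegression)
  output <- iidpred(train, test, epsilons = c(0.05, 0.01))
  if (output[[3]] == 0) {                               # 0 = normal termination
    intervals <- data.frame(lower = output[[1]][, 1],   # bounds for epsilons[1] = 0.05
                            upper = output[[2]][, 1])
    print(intervals)
  } else {
    warning("iidpred returned termination code ", output[[3]])
  }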

References

Vovk, V., Nouretdinov, I., and Gammerman, A. (2009) On-line predictive linear regression. Annals of Statistics 37, 1566-1590. The arXiv version of this paper (http://arxiv.org/abs/math/0511522) describes this program and the algorithm it implements.

Vovk, V., Gammerman, A., and Shafer, G. (2005) Algorithmic Learning in a Random World. New York: Springer. This program implements the algorithm described on pages 30-34 of this book.

Examples

  library(PredictiveRegression)
  # Training set: 4 observations, 1 explanatory variable (column 1) and the response (column 2)
  train <- matrix(c(0, 10, 20, 30, 1.01, 10.99, 21.01, 30.99), nrow = 4, ncol = 2)
  # Test set: 3 observations, explanatory variable only
  test <- matrix(c(5, 15, 25), nrow = 3, ncol = 1)
  output <- iidpred(train, test, c(0.05, 0.2), 0.01)
  print(output[[1]])  # lower bounds of the prediction intervals
  print(output[[2]])  # upper bounds of the prediction intervals
