Rfast (version 1.7.3)

Many simple linear regression coefficients

Description

Computes the coefficients (intercept and slope) of many simple linear regressions, regressing y on each column of x separately.
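
For a single predictor, the least squares coefficients have a simple closed form, which is what each of these regressions amounts to. A minimal illustrative sketch (not Rfast's internal code):

set.seed(1)
x1 <- rnorm(20)
y1 <- 2 + 3 * x1 + rnorm(20)
beta  <- cov(x1, y1) / var(x1)       ## slope of the least squares line
alpha <- mean(y1) - beta * mean(x1)  ## intercept
c(alpha, beta)                       ## agrees with coef( lm(y1 ~ x1) )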

Usage

allbetas(y, x, pvalue = FALSE, logged = FALSE)

Arguments

y
A numerical vector with the response variable. If y contains proportions or percentages, i.e. values between 0 and 1, the logit transformation is applied first and the transformed data are used.
x
A matrix with the data, where the rows denote the observations and the columns contain the independent variables.
pvalue
If you want a hypothesis test that each slope (beta coefficient) is equal to zero, set this to TRUE. It will also produce all the correlations between y and the columns of x.
logged
A boolean variable; if set to TRUE, the logarithm of the p-value is returned.
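
The logit transformation applied to proportion data (see the y argument above) can be written explicitly. A minimal sketch, assuming all values lie strictly between 0 and 1 (0 and 1 would map to infinities):

y <- runif(100)                ## proportions in (0, 1)
y_logit <- log( y / (1 - y) )  ## logit transform used in place of y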

Value

A matrix with the constant (alpha) and the slope (beta) for each simple linear regression. If pvalue is set to TRUE, the correlation of y with each column of x is calculated, along with the relevant test statistic and its associated p-value.
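
For simple linear regression, the correlation test is equivalent to the t-test that the slope is zero, so the extra p-values can be checked against lm for a single predictor. A sketch (the exact column names and ordering of the returned matrix are not specified here, so inspect the output):

x <- matrix( rnorm(50 * 10), ncol = 10 )
y <- rnorm(50)
res <- allbetas(y, x, pvalue = TRUE)
head(res)  ## extra columns with the correlations, test statistics and p-values
## compare with the classical test for the first predictor
summary( lm(y ~ x[, 1]) )$coefficients[2, 4]  ## p-value for the slope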

See Also

mvbetas, correls, univglms, colsums, colVars

Examples

x <- matrix( rnorm(100 * 1000), ncol = 1000 )
y <- rnorm(100)
r <- cor(y, x)  ## correlation of y with each of the xs
a <- allbetas(y, x)  ## the coefficients of each simple linear regression of y with x
b <- matrix(nrow = 1000, ncol = 2)  ## coefficients from the equivalent lm.fit loop
for (i in 1:1000)  b[i, ] <- coef( lm.fit( cbind(1, x[, i]), y ) )
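
## Both approaches fit the same regressions; a quick sanity check (a sketch,
## assuming allbetas, like the loop, stores one regression per row with the
## intercept in the first column and the slope in the second):
all.equal( as.numeric(a), as.numeric(b) )  ## TRUE, up to numerical tolerance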

## time comparison against the standard lm.fit loop
x <- matrix( rnorm(100 * 1000), ncol = 1000 )
y <- rnorm(100)
system.time( allbetas(y, x) )
system.time( for (i in 1:1000)  b[i, ] <- coef( lm.fit( cbind(1, x[, i]), y ) ) )
