RWNN (version 0.4)

rwnn: Random weight neural networks

Description

Set up and estimate the weights of a random weight neural network.

Usage

rwnn(
  formula,
  data = NULL,
  n_hidden = c(),
  lambda = 0,
  type = NULL,
  control = list()
)

# S3 method for formula
rwnn(
  formula,
  data = NULL,
  n_hidden = c(),
  lambda = 0,
  type = NULL,
  control = list()
)

Value

An RWNN-object.

Arguments

formula

A formula specifying features and targets used to estimate the parameters of the output layer.

data

A data set (either a data.frame or a tibble) used to estimate the parameters of the output layer.

n_hidden

A vector of integers designating the number of neurons in each of the hidden layers (the length of the vector is taken as the number of hidden layers).

lambda

The penalisation constant used when training the output layer (see the sketch after this argument list).

type

A string indicating whether this is a regression or classification problem.

control

A list of additional arguments passed to the control_rwnn function.
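Training the output layer with a penalisation constant amounts to a ridge-type least-squares fit of the targets on the (randomly generated) hidden-layer features. The code below is a minimal sketch of that idea under the assumption of a standard ridge solution; H, y, and beta are illustrative names, not the package's internals.

# Minimal ridge sketch: H plays the role of the hidden-layer output matrix, y the targets
set.seed(1)
H <- matrix(rnorm(100 * 10), nrow = 100, ncol = 10)  # random hidden-layer features
y <- rnorm(100)                                       # targets
lambda <- 0.01                                        # penalisation constant

# Penalised least-squares estimate of the output-layer weights
beta <- solve(t(H) %*% H + lambda * diag(ncol(H)), t(H) %*% y)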

Details

A deep RWNN is constructed by increasing the number of elements in the vector n_hidden. Furthermore, if type is NULL, the function tries to deduce whether the problem is regression or classification from the class of the target.
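As an illustration, a three-layer RWNN with the problem type set explicitly might look like the call below. This is a hedged sketch: example_data mirrors the Examples section, and the string "regression" is assumed to be an accepted value of type.

m <- rwnn(y ~ ., data = example_data,
          n_hidden = c(30, 20, 10),  # three hidden layers
          lambda = 0.01,
          type = "regression")       # skip automatic type deduction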

References

Schmidt W., Kraaijveld M., Duin R. (1992) "Feedforward neural networks with random weights." In Proceedings of the 11th IAPR International Conference on Pattern Recognition, Vol. II, Conference B: Pattern Recognition Methodology and Systems, 1–4.

Pao Y., Park G., Sobajic D. (1994) "Learning and generalization characteristics of the random vector functional-link net." Neurocomputing, 6, 163–180.

Huang G.B., Zhu Q.Y., Siew C.K. (2006) "Extreme learning machine: Theory and applications." Neurocomputing, 70(1), 489–501.

Henríquez P.A., Ruz G.A. (2018) "Twitter Sentiment Classification Based on Deep Random Vector Functional Link." In 2018 International Joint Conference on Neural Networks (IJCNN), 1–6.

Examples

## Models with a single hidden layer
n_hidden <- 50
lambda <- 0.01

# Regression
m <- rwnn(y ~ ., data = example_data, n_hidden = n_hidden, lambda = lambda)

# Classification
m <- rwnn(I(y > median(y)) ~ ., data = example_data, n_hidden = n_hidden, lambda = lambda)

## Model with multiple hidden layers
n_hidden <- c(20, 15, 10, 5)
lambda <- 0.01

# Combining outputs from all hidden layers (default)
m <- rwnn(y ~ ., data = example_data, n_hidden = n_hidden, lambda = lambda)

# Using only the output of the last hidden layer
m <- rwnn(y ~ ., data = example_data, n_hidden = n_hidden,
          lambda = lambda, control = list(combine_hidden = FALSE))
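
## Predicting from a fitted model
# Hedged usage sketch: assumes the package provides the usual predict() S3 method
# for RWNN-objects with a newdata argument (not documented on this page).
p <- predict(m, newdata = example_data)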
