sisireg (version 1.2.1)

ssrmlp2_train: 2-layer MLP with partial sum optimization - reworked

Description

Calculates the weights of a 2-layer MLP with respect to the partial-sums criterion.

Usage

ssrmlp2_train(X, Y, std=TRUE, opt='ps', hl=NULL, W=NULL, 
  k=NULL, fn=NULL, eta=0.5, accept = 10, maxIter=1000, alpha=NULL, beta=NULL)

Value

W

List with weight matrices.

Arguments

X

matrix with n-dimensional coordinates.

Y

array with observations.

std

optional: standardize the input values if TRUE.

opt

optional: optimizing function ('simple', 'ps', 'lse').

hl

optional: tuple with the number of perceptrons in each hidden layer.

W

optional: previously calculated weights for refining the model.

k

optional: number of neighbors per quadrant.

fn

optional: quantile for partial sums.

eta

optional: constant step width for the gradient algorithm (eta=0.0 selects the Armijo step-size rule).

accept

optional: acceptable percentage deviation from the partial-sums criterion with respect to number and quantile.

maxIter

optional: maximum number of iterations for the numeric solver (default maxIter=1000).

alpha

optional: weight parameter for function to be minimized.

beta

optional: weight parameter for side condition.
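
The optional arguments can be combined in a single call. A minimal sketch with purely illustrative values (only the function name and parameters documented above are confirmed; the data and chosen settings are assumptions):

```r
library(sisireg)

# toy data: 100 points in 2 dimensions
set.seed(1)
X <- matrix(cbind(rnorm(100), rnorm(100)), ncol = 2)
Y <- as.double(rnorm(100))

# least-squares optimization with an explicit two-layer layout
# (8 perceptrons per hidden layer) and a fixed gradient step width
W <- ssrmlp2_train(X, Y, std = TRUE, opt = 'lse', hl = c(8, 8),
                   eta = 0.5, maxIter = 500)
```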

Author

Dr. Lars Metzner

References

Dr. Lars Metzner (2021) Adäquates Maschinelles Lernen. Independently Published.

Examples

# \donttest{
# generate data
set.seed(42)
x <- rnorm(300)
y <- rnorm(300)
z <- rnorm(300) + atan2(x, y)
# coordinates
X <- matrix(cbind(x,y), ncol = 2)
Y <- as.double(z)
# Training
W <- ssrmlp2_train(X, Y)
# }
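
Because the W argument accepts previously calculated weights, the fit above can be refined by passing its result back into a second call. A sketch continuing the example (the iteration count is illustrative):

```r
# \donttest{
# refine the trained model: warm-start from the weights W
# returned by the previous call
W2 <- ssrmlp2_train(X, Y, W = W, maxIter = 500)
# }
```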
