

Ruta

Software for unsupervised deep architectures

Get uncomplicated access to unsupervised deep neural networks, from building their architecture to their training and evaluation

Get started

Installation

Dependencies

Ruta is based on the well-known open source deep learning library Keras and its R interface, and it has been developed to work with the TensorFlow backend. To install these dependencies you will also need a Python interpreter; you can get them via the Python package manager pip, or possibly through your distribution's package manager if you are running Linux.

$ sudo pip install tensorflow
$ sudo pip install keras
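
If you prefer to manage the Python side from within R, the keras R package also bundles an installer. A minimal sketch of that alternative route (it sets up Keras and TensorFlow in a dedicated Python environment):

# Alternative to pip: install Keras and the TensorFlow backend from R
install.packages("keras")
keras::install_keras()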

Otherwise, you can follow the official installation guides for TensorFlow and Keras.

Ruta package

# Get Ruta from CRAN
install.packages("ruta")

# Or get the latest development version from GitHub
devtools::install_github("fdavidcl/ruta")

All R dependencies will be installed automatically. These include the Keras R interface and purrr. For convenience, we also recommend installing and loading either magrittr or purrr so that the pipe operator %>% is available.
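
For instance, loading purrr (already a dependency of Ruta) is enough to make the pipe available for the snippets below:

library(purrr)
# %>% passes its left-hand side as the first argument of the call on its right
iris[, 1:4] %>% as.matrix %>% head(3)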

Usage

The easiest way to start working with Ruta is the autoencode() function. It allows you to select a type of autoencoder and transform the feature space of a data set into another one with some desirable properties, depending on the chosen type.

iris[, 1:4] %>% as.matrix %>% autoencode(2, type = "denoising")
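
Since autoencode() returns the encoded data matrix, its result can be stored and inspected directly. A small sketch along these lines (the variable name codes is just illustrative):

# Reduce the four iris features to a 2-dimensional code
codes <- iris[, 1:4] %>% as.matrix %>% autoencode(2, type = "denoising")
dim(codes)                      # should be 150 observations by 2 encoded features
plot(codes, col = iris$Species) # inspect the learned 2-dimensional representation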

You can learn more about different variants of autoencoders by reading A practical tutorial on autoencoders for nonlinear feature fusion.

Ruta provides the functionality to build diverse neural architectures (see autoencoder()), train them as autoencoders (see train()) and perform different tasks with the resulting models (see reconstruct()), including evaluation (see evaluate_mean_squared_error()). The following is a basic example of a natural pipeline with an autoencoder:

library(ruta)
library(purrr)

# Shuffle rows and normalize the dataset
x <- iris[sample(nrow(iris)), 1:4] %>% as.matrix %>% scale
x_train <- x[1:100, ]
x_test <- x[101:150, ]

autoencoder(
  input() + dense(256) + dense(36, "tanh") + dense(256) + output("sigmoid"),
  loss = "mean_squared_error"
) %>%
  make_contractive(weight = 1e-4) %>%
  train(x_train, epochs = 40) %>%
  evaluate_mean_squared_error(x_test)
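
The trained model can also be kept in a variable and reused for the other tasks mentioned above. A brief sketch with the same data, where model, codes and recons are illustrative names:

model <- autoencoder(
  input() + dense(16, "relu") + dense(2, "tanh") + dense(16, "relu") + output("sigmoid"),
  loss = "mean_squared_error"
) %>%
  train(x_train, epochs = 40)

codes  <- encode(model, x_test)       # low-dimensional representation of new data
recons <- reconstruct(model, x_test)  # reconstructions in the original feature space
evaluate_mean_squared_error(model, x_test)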

For more details, see other examples and the documentation.


Version: 1.0.2

License: GPL (>= 3) | file LICENSE


Maintainer: David Charte

Last Published: May 8th, 2018

Functions in ruta (1.0.2)

is_trained: Detect trained models
to_keras.ruta_layer_variational: Obtain a Keras block of layers for the variational autoencoder
noise: Noise generator
noise_gaussian: Additive Gaussian noise
noise_cauchy: Additive Cauchy noise
noise_ones: Filter to add ones noise
generate.ruta_autoencoder_variational: Generate samples from a generative model
is_variational: Detect whether an autoencoder is variational
to_keras.ruta_loss_contraction: Obtain a Keras loss
variational_block: Create a variational block of layers
weight_decay: Weight decay
to_keras.ruta_sparsity: Translate sparsity regularization to Keras regularizer
make_sparse: Add sparsity regularization to an autoencoder
new_autoencoder: Create an autoencoder learner
is_denoising: Detect whether an autoencoder is denoising
noise_saltpepper: Filter to add salt-and-pepper noise
to_keras.ruta_network: Build a Keras network
noise_zeros: Filter to add zero noise
is_contractive: Detect whether an autoencoder is contractive
evaluation_metric: Custom evaluation metrics
is_robust: Detect whether an autoencoder is robust
is_sparse: Detect whether an autoencoder is sparse
new_layer: Layer wrapper constructor
reconstruct: Retrieve reconstructions for input data
print.ruta_autoencoder: Inspect Ruta objects
new_network: Sequential network constructor
input: Create an input layer
to_keras.ruta_weight_decay: Obtain a Keras weight decay
+.ruta_network: Add layers to a network/Join networks
train.ruta_autoencoder: Train a learner object with data
make_contractive: Add contractive behavior to any autoencoder
loss_variational: Variational loss
make_denoising: Add denoising behavior to any autoencoder
layer_keras: Custom layer from Keras
save_as: Save and load Ruta models
sparsity: Sparsity regularization
output: Create an output layer
make_robust: Add robust behavior to any autoencoder
plot.ruta_network: Draw a neural network
[.ruta_network: Access subnetworks of a network
to_keras.ruta_autoencoder: Extract Keras models from an autoencoder wrapper
to_keras.ruta_layer_input: Convert Ruta layers onto Keras layers
to_keras: Convert a Ruta object onto Keras objects and functions
autoencoder: Create an autoencoder learner
autoencoder_contractive: Create a contractive autoencoder
autoencoder_robust: Create a robust autoencoder
autoencode: Automatically compute an encoding of a data matrix
evaluate_mean_squared_error: Evaluation metrics
encode: Retrieve encoding of data
decode: Retrieve decoding of encoded data
correntropy: Correntropy loss
contraction: Contractive loss
autoencoder_sparse: Sparse autoencoder
encoding_index: Get the index of the encoding
dropout: Dropout layer
dense: Create a fully-connected neural layer
add_weight_decay: Add weight decay to any autoencoder
as_loss: Coercion to ruta_loss
as_network: Coercion to ruta_network
apply_filter.ruta_noise_zeros: Apply filters
autoencoder_denoising: Create a denoising autoencoder
autoencoder_variational: Build a variational autoencoder
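
As a closing note, several of the network helpers listed above compose naturally. A purely illustrative sketch, assuming the indexing and plotting generics behave as documented:

library(ruta)

# Build a symmetric network, draw it, and keep the encoder half as a subnetwork
net <- input() + dense(16, "relu") + dense(2, "tanh") + dense(16, "relu") + output("sigmoid")
plot(net)            # plot.ruta_network: draw the architecture
encoder <- net[1:3]  # [.ruta_network: layers up to and including the encoding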