Base R6 class for Keras callbacks
Base R6 class for Keras layers
MobileNet model architecture.
ResNet50 model for Keras.
VGG16 and VGG19 models for Keras.
Xception V1 model for Keras.
Keras backend tensor engine
Bidirectional wrapper for RNNs.
Activation functions
Inception V3 model, with weights pre-trained on ImageNet.
callback_model_checkpoint
Save the model after every epoch.
Callback that prints metrics to stdout.
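The callback entries above are passed to `fit()` as a list. A minimal sketch, assuming `x_train` and `y_train` are placeholder training data:

```r
library(keras)

# Sketch only: x_train / y_train stand in for real training data.
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(10)) %>%
  layer_dense(units = 1)

model %>% compile(optimizer = "rmsprop", loss = "mse")

model %>% fit(
  x_train, y_train,
  epochs = 20,
  callbacks = list(
    callback_model_checkpoint("weights.{epoch:02d}.hdf5"),  # save after every epoch
    callback_terminate_on_naan()                            # stop if loss becomes NaN
  )
)
```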
MinMaxNorm weight constraint
NonNeg weight constraint
TensorBoard basic visualizations
callback_terminate_on_naan
Callback that terminates training when a NaN loss is encountered.
Create a Keras Layer
Boston housing price regression dataset
callback_reduce_lr_on_plateau
Reduce learning rate when a metric has stopped improving.
Callback used to stream events to a server.
Configure a Keras model for training
IMDB Movie reviews sentiment classification
MNIST database of handwritten digits
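The dataset helpers above each return ready-to-use R arrays. A sketch for MNIST:

```r
library(keras)

# Load MNIST; returns nested lists of train/test arrays.
mnist <- dataset_mnist()
x_train <- mnist$train$x / 255   # scale pixel values to [0, 1]
y_train <- to_categorical(mnist$train$y, num_classes = 10)
dim(x_train)   # 60000 x 28 x 28
```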
Loads an image into PIL format.
Converts a PIL Image instance to a 3d-array.
Callback that streams epoch results to a csv file
Stop training when a monitored quantity has stopped improving.
UnitNorm weight constraint
Count the total number of scalars composing the weights.
Fits the model on data yielded batch-by-batch by a generator.
Create a custom callback
callback_learning_rate_scheduler
Learning rate scheduler.
CIFAR10 small image classification
Fit image data generator internal statistics to some sample data.
flow_images_from_directory
Generates batches of data from images in a directory (with optional augmented/normalized data)
Layer/Model configuration
initializer_glorot_normal
Glorot normal initializer, also called Xavier normal initializer.
CIFAR100 small image classification
Update tokenizer internal vocabulary based on a list of texts or list of sequences.
Generates batches of augmented/normalized data from image data and labels
Representation of an HDF5 dataset to be used instead of an R array
Evaluates the model on a data generator.
Train a Keras model
Downloads a file from a URL if it is not already in the cache.
Retrieve tensors for layers with multiple nodes
initializer_glorot_uniform
Glorot uniform initializer, also called Xavier uniform initializer.
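Initializers like these are not called on their own; they are passed to a layer's `kernel_initializer` or `bias_initializer` argument. A minimal sketch:

```r
library(keras)

# Explicit weight initialization for a dense layer.
layer <- layer_dense(
  units = 64,
  kernel_initializer = initializer_glorot_uniform(seed = 42),
  bias_initializer = initializer_zeros()
)
```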
initializer_random_uniform
Initializer that generates tensors with a uniform distribution.
initializer_truncated_normal
Initializer that generates a truncated normal distribution.
Generate minibatches of image data with real-time data augmentation.
He normal initializer.
He uniform variance scaling initializer.
layer_activation_thresholded_relu
Thresholded Rectified Linear Unit.
layer_activity_regularization
Layer that applies an update to the cost function based on input activity.
Average pooling operation for spatial data.
Average pooling operation for 3D data (spatial or spatio-temporal).
MaxNorm weight constraint
Reuters newswire topics classification
Evaluate a Keras model
Retrieves a layer based on either its name (unique) or index.
Initializer that generates the identity matrix.
LeCun normal initializer.
initializer_lecun_uniform
LeCun uniform initializer.
Initializer that generates tensors initialized to 1.
layer_activation_leaky_relu
Leaky version of a Rectified Linear Unit.
layer_activation_parametric_relu
Parametric Rectified Linear Unit.
initializer_variance_scaling
Initializer capable of adapting its scale to the shape of weights.
Initializer that generates tensors initialized to 0.
Layer that adds a list of inputs.
Transposed 2D convolution layer (sometimes called Deconvolution).
Cropping layer for 3D data (e.g. spatial or spatio-temporal).
Add a densely-connected NN layer to an output
Applies Alpha Dropout to the input.
1D convolution layer (e.g. temporal convolution).
2D convolution layer (e.g. spatial convolution over images).
Turns positive integers (indexes) into dense vectors of fixed size.
layer_global_average_pooling_1d
Global average pooling operation for temporal data.
layer_global_average_pooling_2d
Global average pooling operation for spatial data.
Layer that computes the maximum (element-wise) of a list of inputs.
3D convolution layer (e.g. spatial convolution over volumes).
imagenet_decode_predictions
Decodes the prediction of an ImageNet model.
imagenet_preprocess_input
Preprocesses a tensor encoding a batch of images.
Keras Model
Keras Model composed of a linear stack of layers
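A sequential model is built as a linear stack of `layer_*` calls, then configured for training with `compile()`. A minimal sketch:

```r
library(keras)

# Linear stack of layers; compile() configures loss, optimizer, and metrics.
model <- keras_model_sequential() %>%
  layer_dense(units = 256, activation = "relu", input_shape = c(784)) %>%
  layer_dropout(rate = 0.4) %>%
  layer_dense(units = 10, activation = "softmax")

model %>% compile(
  loss = "categorical_crossentropy",
  optimizer = optimizer_rmsprop(),
  metrics = c("accuracy")
)
```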
Layer that multiplies (element-wise) a list of inputs.
Upsampling layer for 1D inputs.
Upsampling layer for 2D inputs.
Model loss functions
Apply an activation function to an output.
Exponential Linear Unit.
layer_batch_normalization
Batch normalization layer (Ioffe and Szegedy, 2014).
Layer that concatenates a list of inputs.
Layer/Model weights as R arrays
Keras implementation
Initializer that generates tensors initialized to a constant value.
Initializer that generates a random orthogonal matrix.
Generates a word rank-based probabilistic sampling table.
Stochastic gradient descent optimizer
Pads each sequence to the same length (length of the longest sequence).
Convert a list of sequences into a matrix.
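The sequence-preprocessing helpers above turn ragged integer sequences into rectangular arrays. A sketch of `pad_sequences()`, which by default prepends zeros up to the length of the longest sequence:

```r
library(keras)

seqs <- list(c(1, 2, 3), c(4, 5), c(6))
# Pad to the length of the longest sequence (zeros are prepended by default).
pad_sequences(seqs)
#>      [,1] [,2] [,3]
#> [1,]    1    2    3
#> [2,]    0    4    5
#> [3,]    0    0    6
```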
initializer_random_normal
Initializer that generates tensors with a normal distribution.
Layer that averages a list of inputs.
Average pooling for temporal data.
Transposed 3D convolution layer (sometimes called Deconvolution).
Convolutional LSTM.
Generates skipgram word pairs.
Text tokenization utility
Convert a list of texts to a matrix.
Flattens an input
layer_global_average_pooling_3d
Global average pooling operation for 3D data.
layer_global_max_pooling_1d
Global max pooling operation for temporal data.
Max pooling operation for spatial data.
Max pooling operation for 3D data (spatial or spatio-temporal).
Upsampling layer for 3D inputs.
Zero-padding layer for 1D input (e.g. temporal sequence).
Model performance metrics
Model configuration as JSON
Layer that computes a dot product between samples in two tensors.
Applies Dropout to the input.
Gated Recurrent Unit - Cho et al.
Apply multiplicative 1-centered Gaussian noise.
Apply additive zero-centered Gaussian noise.
Wraps an arbitrary expression as a layer
layer_locally_connected_1d
Locally-connected layer for 1D inputs.
Permute the dimensions of an input according to a given pattern
Repeats the input n times.
Fully-connected RNN where the output is to be fed back to input.
Spatial 1D version of Dropout.
Cropping layer for 1D input (e.g. temporal sequence).
Cropping layer for 2D input (e.g. picture).
layer_global_max_pooling_2d
Global max pooling operation for spatial data.
layer_global_max_pooling_3d
Global max pooling operation for 3D data.
Remove the last layer in a model
predict.keras.engine.training.Model
Generate predictions from a Keras model
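Prediction follows the same pipe style as training. A sketch, assuming `model` is a compiled, trained model and `x_test` is placeholder input data:

```r
library(keras)

# Sketch only: `model` and `x_test` stand in for a trained model and new data.
preds <- model %>% predict(x_test)             # raw model outputs
classes <- model %>% predict_classes(x_test)   # class indices, for classifiers
```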
Generates probability or class probability predictions for the input samples.
Objects exported from other packages
summary.keras.engine.training.Model
Print a summary of a Keras model
layer_locally_connected_2d
Locally-connected layer for 2D inputs.
Long Short-Term Memory unit - Hochreiter 1997.
Spatial 2D version of Dropout.
Spatial 3D version of Dropout.
Zero-padding layer for 2D input (e.g. picture).
Zero-padding layer for 3D data (spatial or spatio-temporal).
Adam optimizer
Adamax optimizer
L1 and L2 regularization
Reset the states for a layer
Transforms each text in texts into a sequence of integers.
Adadelta optimizer.
Adagrad optimizer.
Pipe operator
plot.keras_training_history
Plot training history
Converts a text to a sequence of indexes in a fixed-size hashing space.
Input layer
Masks a sequence by using a mask value to skip timesteps.
Max pooling operation for temporal data.
Reshapes an output to a certain shape.
texts_to_sequences_generator
Transforms each text in texts into a sequence of integers.
Single gradient update or model evaluation over one batch of samples.
Generates predictions for the input samples from a data generator.
Returns predictions for a single batch of samples.
Apply a layer to every temporal slice of an input.
Converts a class vector (integers) to binary class matrix.
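`to_categorical()` one-hot encodes an integer class vector:

```r
library(keras)

# Integer class vector -> binary (one-hot) class matrix.
to_categorical(c(0, 1, 2), num_classes = 3)
#>      [,1] [,2] [,3]
#> [1,]    1    0    0
#> [2,]    0    1    0
#> [3,]    0    0    1
```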
Depthwise separable 2D convolution.
Model configuration as YAML
Normalize a matrix or nd-array
Nesterov Adam optimizer
RMSProp optimizer
Save/Load models using HDF5 files
Save/Load model weights using HDF5 files
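A sketch of the HDF5 persistence helpers, assuming `model` is an existing Keras model:

```r
library(keras)

# Whole model: architecture + weights + optimizer state.
save_model_hdf5(model, "model.h5")
model2 <- load_model_hdf5("model.h5")

# Weights only (architecture must already match).
save_model_weights_hdf5(model, "weights.h5")
model %>% load_model_weights_hdf5("weights.h5")
```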
Convert text to a sequence of words (or tokens).
One-hot encode a text into a list of word indexes in a vocabulary of size n.