Inception V3 model, with weights pre-trained on ImageNet.
ResNet50 model for Keras.
Base R6 class for Keras callbacks
Activation functions
VGG16 and VGG19 models for Keras.
Xception V1 model for Keras.
Keras backend tensor engine
Bidirectional wrapper for RNNs.
Callback that streams epoch results to a CSV file
Stop training when a monitored quantity has stopped improving.
TensorBoard basic visualizations
Configure a Keras model for training
Count the total number of scalars composing the weights.
Boston housing price regression dataset
MaxNorm weight constraint
MinMaxNorm weight constraint
Reuters newswire topics classification
flow_images_from_directory
Generates batches of data from images in a directory (with optionally augmented/normalized data)
Layer/Model configuration
Initializer that generates tensors initialized to a constant value.
Evaluate a Keras model
Update tokenizer internal vocabulary based on a list of texts or list of sequences.
Generates batches of augmented/normalized data from image data and labels
Create a custom callback
callback_learning_rate_scheduler
Learning rate scheduler.
CIFAR10 small image classification
CIFAR100 small image classification
Evaluates the model on a data generator.
Train a Keras model
Retrieves a layer based on either its name (unique) or index.
Layer/Model weights as R arrays
initializer_random_uniform
Initializer that generates tensors with a uniform distribution.
initializer_truncated_normal
Initializer that generates a truncated normal distribution.
callback_reduce_lr_on_plateau
Reduce learning rate when a metric has stopped improving.
Callback used to stream events to a server.
IMDB Movie reviews sentiment classification
layer_activation_leaky_relu
Leaky version of a Rectified Linear Unit.
layer_activation_parametric_relu
Parametric Rectified Linear Unit.
Average pooling for temporal data.
initializer_glorot_normal
Glorot normal initializer, also called Xavier normal initializer.
initializer_glorot_uniform
Glorot uniform initializer, also called Xavier uniform initializer.
He normal initializer.
Apply an activation function to an output.
MNIST database of handwritten digits
Downloads a file from a URL if it is not already in the cache.
Retrieve tensors for layers with multiple nodes
callback_model_checkpoint
Save the model after every epoch.
Callback that prints metrics to stdout.
NonNeg weight constraint
UnitNorm weight constraint
Average pooling operation for spatial data.
Layer that computes a dot product between samples in two tensors.
Applies Dropout to the input.
Loads an image into PIL format.
Converts a PIL Image instance to a 3d-array.
He uniform variance scaling initializer.
Initializer that generates the identity matrix.
Fits the model on data yielded batch-by-batch by a generator.
Fit image data generator internal statistics to some sample data.
Representation of an HDF5 dataset to be used instead of an R array
imagenet_decode_predictions
Decodes the prediction of an ImageNet model.
imagenet_preprocess_input
Preprocesses a tensor encoding a batch of images.
Initializer that generates a random orthogonal matrix.
Generate minibatches of image data with real-time data augmentation.
initializer_lecun_uniform
LeCun uniform initializer.
Initializer that generates tensors initialized to 1.
layer_activation_thresholded_relu
Thresholded Rectified Linear Unit.
layer_activity_regularization
Layer that applies an update to the cost function based on input activity.
Average pooling operation for 3D data (spatial or spatio-temporal).
layer_batch_normalization
Batch normalization layer (Ioffe and Szegedy, 2014).
initializer_random_normal
Initializer that generates tensors with a normal distribution.
Keras Model
Keras Model composed of a linear stack of layers
Exponential Linear Unit.
Layer that adds a list of inputs.
Layer that averages a list of inputs.
Turns positive integers (indexes) into dense vectors of fixed size.
Flattens an input
Apply multiplicative 1-centered Gaussian noise.
Apply additive zero-centered Gaussian noise.
3D convolution layer (e.g. spatial convolution over volumes).
Convolutional LSTM.
Gated Recurrent Unit - Cho et al. 2014.
Layer that computes the maximum (element-wise) of a list of inputs.
Layer that multiplies (element-wise) a list of inputs.
Reshapes an output to a certain shape.
Input layer
Wraps an arbitrary expression as a layer
layer_locally_connected_1d
Locally-connected layer for 1D inputs.
layer_global_max_pooling_2d
Global max pooling operation for spatial data.
layer_global_max_pooling_3d
Global max pooling operation for 3D data.
Max pooling operation for spatial data.
Max pooling operation for 3D data (spatial or spatio-temporal).
Fully-connected RNN where the output is to be fed back to input.
Spatial 1D version of Dropout.
initializer_variance_scaling
Initializer capable of adapting its scale to the shape of weights.
Initializer that generates tensors initialized to 0.
Layer that concatenates a list of inputs.
Permute the dimensions of an input according to a given pattern
Zero-padding layer for 2D input (e.g. picture).
Zero-padding layer for 3D data (spatial or spatio-temporal).
Stochastic gradient descent optimizer
Pads each sequence to the same length (length of the longest sequence).
2D convolution layer (e.g. spatial convolution over images).
Transposed convolution layer (sometimes called Deconvolution).
Cropping layer for 1D input (e.g. temporal sequence).
Cropping layer for 2D input (e.g. picture).
Repeats the input n times.
1D convolution layer (e.g. temporal convolution).
Cropping layer for 3D data (e.g. spatial or spatio-temporal).
Add a densely-connected NN layer to an output
layer_global_average_pooling_1d
Global average pooling operation for temporal data.
layer_global_average_pooling_2d
Global average pooling operation for spatial data.
layer_locally_connected_2d
Locally-connected layer for 2D inputs.
Long Short-Term Memory unit - Hochreiter 1997.
Spatial 2D version of Dropout.
Spatial 3D version of Dropout.
Model performance metrics
layer_global_average_pooling_3d
Global average pooling operation for 3D data.
layer_global_max_pooling_1d
Global max pooling operation for temporal data.
Masks a sequence by using a mask value to skip timesteps.
Depthwise separable 2D convolution.
Upsampling layer for 3D inputs.
Zero-padding layer for 1D input (e.g. temporal sequence).
Pipe operator
Remove the last layer in a model
Returns predictions for a single batch of samples.
Generates probability or class probability predictions for the input samples.
Max pooling operation for temporal data.
Upsampling layer for 1D inputs.
Upsampling layer for 2D inputs.
Model configuration as YAML
Normalize a matrix or nd-array
Adadelta optimizer.
Model loss functions
Generates a word rank-based probabilistic sampling table.
Adam optimizer
Adagrad optimizer.
Save/Load model weights using HDF5 files
Convert a list of sequences into a matrix.
Adamax optimizer
Reset the states for a layer
Save/Load models using HDF5 files
Transforms each text in texts to a sequence of integers.
texts_to_sequences_generator
Transforms each text in texts to a sequence of integers.
One-hot encode a text into a list of word indexes in a vocabulary of size n.
Convert text to a sequence of words (or tokens).
Generates skipgram word pairs.
summary.tensorflow.keras.engine.training.Model
Print a summary of a Keras model
Model configuration as JSON
Nesterov Adam optimizer
RMSProp optimizer
Objects exported from other packages
L1 and L2 regularization
Apply a layer to every temporal slice of an input.
Converts a class vector (integers) to binary class matrix.
predict.tensorflow.keras.engine.training.Model
Generate predictions from a Keras model
Generates predictions for the input samples from a data generator.
Text tokenization utility
Convert a list of texts to a matrix.
Single gradient update or model evaluation over one batch of samples.
predict.keras.engine.training.Model
Generate predictions from a Keras model
summary.keras.engine.training.Model
Print a summary of a Keras model
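Many of the entries above combine in a typical workflow: define a model, add layers with the pipe operator, configure it for training, then inspect it. A minimal sketch, assuming the keras R package and a TensorFlow backend are installed (layer sizes and the 784-feature input shape are illustrative, not from the index):

```r
library(keras)

# Keras Model composed of a linear stack of layers,
# built with the pipe operator and dense layers
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(784)) %>%
  layer_dense(units = 10, activation = "softmax")

# Configure the model for training (optimizer, loss, metrics)
model %>% compile(
  optimizer = optimizer_rmsprop(),
  loss = "categorical_crossentropy",
  metrics = c("accuracy")
)

# Print a summary of the model
summary(model)
```

From here, `fit()` trains the model, `evaluate()` measures it on held-out data, and `predict()` generates predictions, mirroring the corresponding index entries above.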