Base R6 class for Keras callbacks
callback_learning_rate_scheduler
Learning rate scheduler.
Weight constraints
Count the total number of scalars composing the weights.
Reuters newswire topics classification
evaluate.keras.engine.training.Model
Evaluate a Keras model
Freeze and unfreeze weights
Retrieve the next item from a generator
Retrieve tensors for layers with multiple nodes
Retrieves a layer based on either its name (unique) or index.
initializer_glorot_uniform
Glorot uniform initializer, also called Xavier uniform initializer.
He normal initializer.
Initializer that generates tensors initialized to 1.
MobileNet model architecture.
Initializer that generates a random orthogonal matrix.
Returns the index of the minimum value along an axis.
Active Keras backend
Cast an array to the default Keras float type.
MobileNetV2 model architecture
k_categorical_crossentropy
Categorical crossentropy between an output tensor and a target tensor.
VGG16 and VGG19 models for Keras.
1D convolution.
2D convolution.
Xception V1 model for Keras.
Clone a model instance.
compile.keras.engine.training.Model
Configure a Keras model for training
Boston housing price regression dataset
Cumulative product of the values in a tensor, alongside the specified axis.
Cumulative sum of the values in a tensor, alongside the specified axis.
Element-wise exponential.
Adds a 1-sized dimension at index axis.
CIFAR10 small image classification
Activation functions
Instantiates the DenseNet architecture.
Keras backend tensor engine
Get the uid for the default graph.
Returns the value of a variable.
application_inception_resnet_v2
Inception-ResNet v2 model, with weights trained on ImageNet
Bidirectional wrapper for RNNs.
callback_model_checkpoint
Save the model after every epoch.
Inception V3 model, with weights pre-trained on ImageNet.
fit.keras.engine.training.Model
Train a Keras model
Fits the model on data yielded batch-by-batch by a generator.
ResNet50 model for Keras.
Instantiates a NASNet model.
callback_reduce_lr_on_plateau
Reduce learning rate when a metric has stopped improving.
Callback used to stream events to a server.
Returns the shape of tensor or variable as a list of int or NULL entries.
Returns whether x is a Keras tensor.
Callback that prints metrics to stdout.
IMDB Movie reviews sentiment classification
Base R6 class for Keras layers
Computes log(sum(exp(elements across dimensions of a tensor))).
Create a Keras Layer
Fit image data generator internal statistics to some sample data.
Create a Keras Wrapper
Base R6 class for Keras wrappers
Update tokenizer internal vocabulary based on a list of texts or list of sequences.
Callback that streams epoch results to a csv file
k_manual_variable_initialization
Sets the manual variable initialization flag.
Stop training when a monitored quantity has stopped improving.
Layer/Model weights as R arrays
Layer/Model configuration
Representation of HDF5 dataset to be used instead of an R array
initializer_glorot_normal
Glorot normal initializer, also called Xavier normal initializer.
Initializer that generates tensors initialized to a constant value.
TensorBoard basic visualizations
MNIST database of handwritten digits
Downloads a file from a URL if it is not already in the cache.
He uniform variance scaling initializer.
initializer_truncated_normal
Initializer that generates a truncated normal distribution.
Initializer that generates the identity matrix.
Generates batches of augmented/normalized data from image data and labels
Loads an image into PIL format.
Generate batches of image data with real-time data augmentation. The data will be looped over (in batches).
flow_images_from_directory
Generates batches of data from images in a directory (with optional augmented/normalized data).
callback_terminate_on_naan
Callback that terminates training when a NaN loss is encountered.
imagenet_preprocess_input
Preprocesses a tensor or array encoding a batch of images.
Keras implementation
Minimum value in a tensor.
Check if Keras is Available
initializer_random_normal
Initializer that generates tensors with a normal distribution.
Element-wise absolute value.
initializer_random_uniform
Initializer that generates tensors with a uniform distribution.
Creates a 1D tensor containing a sequence of integers.
CIFAR100 small image classification
initializer_variance_scaling
Initializer capable of adapting its scale to the shape of weights.
Bitwise reduction (logical AND).
Returns the value of more than one tensor variable.
Returns the index of the maximum value along an axis.
Fashion-MNIST database of fashion articles
Binary crossentropy between an output tensor and a target tensor.
Sets the values of many tensor variables at once.
Applies batch normalization on x given mean, var, beta and gamma.
Bitwise reduction (logical OR).
Element-wise minimum of two tensors.
Reverse a tensor along the specified axes.
Casts a tensor to a different dtype and returns it.
Evaluates the model on a data generator.
3D deconvolution (i.e. transposed convolution).
Returns a tensor with random binomial distribution of values.
Computes cos of x element-wise.
Destroys the current TF graph and creates a new one.
Returns the static number of elements in a Keras variable or tensor.
Element-wise value clipping.
Adds a bias vector to a tensor.
Runs CTC loss algorithm on each batch element.
export_savedmodel.keras.engine.training.Model
Export a Saved Model
Depthwise 2D convolution with separable filters.
3D array representation of images
Element-wise sign.
Decodes the output of a softmax.
Returns a tensor with normal distribution of values.
Exponential linear unit.
Reduce elems using fn to combine them from right to left.
Fuzz factor used in numeric expressions.
Instantiates a Keras function
imagenet_decode_predictions
Decodes the prediction of an ImageNet model.
Creates a tensor by tiling x by n.
initializer_lecun_uniform
LeCun uniform initializer.
Initializer that generates tensors initialized to 0.
LeCun normal initializer.
Multiplies 2 tensors (and/or variables) and returns a tensor.
Element-wise equality between two tensors.
k_ctc_label_dense_to_sparse
Converts CTC labels from dense to sparse.
Install Keras and the TensorFlow backend
Batchwise dot product.
Iterates over the time dimension of a tensor
Turn a nD tensor into a 2D tensor with same 1st dimension.
Retrieves the elements of indices in the tensor reference.
Evaluates the value of a variable.
Instantiates an identity matrix and returns it.
Concatenates a list of tensors alongside the specified axis.
Flatten a tensor.
Default image data format convention ('channels_first' or 'channels_last').
TF session to be used by the backend.
Selects x in test phase, and alt otherwise.
Creates a constant tensor.
Segment-wise linear approximation of sigmoid.
Returns the gradients of variables w.r.t. loss.
Returns the shape of a variable.
Returns the learning phase flag.
Exponential Linear Unit.
2D deconvolution (i.e. transposed convolution).
Returns a tensor with the same content as the input tensor.
3D convolution.
Returns whether x is a symbolic tensor.
Max pooling operation for spatial data.
Computes sin of x element-wise.
Compute the moving average of a variable.
Element-wise truth value of (x < y).
Removes a 1-dimension from the tensor at index axis.
Sets entries in x to zero at random, while scaling the entire tensor.
Stacks a list of rank R tensors into a rank R+1 tensor.
Create a Keras custom model
Element-wise truth value of (x <= y).
Converts a sparse tensor into a dense tensor and returns it.
Normalizes a tensor w.r.t. the L2 norm alongside the specified axis.
Keras Model composed of a linear stack of layers
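The sequential model above is the usual entry point to the package; as orientation for this index, a minimal sketch of that workflow in R (assuming the keras package and a backend are installed — the layer sizes and optimizer here are illustrative, not prescribed):

```r
library(keras)

# Define a small feed-forward classifier as a linear stack of layers
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(784)) %>%
  layer_dense(units = 10, activation = "softmax")

# Configure the model for training (see compile.keras.engine.training.Model)
model %>% compile(
  optimizer = optimizer_rmsprop(),
  loss = "categorical_crossentropy",
  metrics = c("accuracy")
)

# Training would then use fit(), e.g.:
# model %>% fit(x_train, y_train, epochs = 5, batch_size = 128)
```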
Returns the dtype of a Keras tensor or variable, as a string.
Apply an activation function to an output.
2D convolution layer (e.g. spatial convolution over images).
Default float type
Element-wise maximum of two tensors.
Selects x in train phase, and alt otherwise.
Returns whether the targets are in the top k predictions.
Apply 2D conv with un-shared weights.
Mean of a tensor, alongside the specified axis.
Reduce elems using fn to combine them from left to right.
Apply 1D conv with un-shared weights.
Computes the one-hot representation of an integer tensor.
Instantiates an all-ones tensor variable and returns it.
Element-wise truth value of (x > y).
Element-wise truth value of (x >= y).
Average pooling operation for spatial data.
layer_activation_leaky_relu
Leaky version of a Rectified Linear Unit.
Element-wise log.
Returns whether x is a placeholder.
layer_global_average_pooling_3d
Global Average pooling operation for 3D data.
Instantiates a placeholder tensor and returns it.
k_random_uniform_variable
Instantiates a variable with values drawn from a uniform distribution.
k_normalize_batch_in_training
Computes mean and std for batch then applies batch_normalization on batch.
Returns whether a tensor is a sparse tensor.
Instantiates a variable with values drawn from a normal distribution.
Returns the number of axes in a tensor, as an integer.
Average pooling for temporal data.
Depthwise separable 2D convolution.
Reset graph identifiers.
Returns a tensor with uniform distribution of values.
Add a densely-connected NN layer to an output
Element-wise inequality between two tensors.
Turns positive integers (indexes) into dense vectors of fixed size.
Softplus of a tensor.
Prints message and the tensor value when evaluated.
2D Pooling.
Layer that subtracts two inputs.
Wraps arbitrary expression as a layer
Element-wise square.
Reshapes a tensor to the specified shape.
Map the function fn over the elements elems and return the outputs.
Standard deviation of a tensor, alongside the specified axis.
Element-wise rounding to the closest integer.
Multiplies the values in a tensor, alongside the specified axis.
Resizes the images contained in a 4D tensor.
Repeats the elements of a tensor along an axis.
Maximum value in a tensor.
Masks a sequence by using a mask value to skip timesteps.
Rectified linear unit.
Scaled Exponential Linear Unit.
layer_global_max_pooling_1d
Global max pooling operation for temporal data.
Instantiates an all-zeros variable and returns it.
Instantiates an all-ones variable of the same shape as another tensor.
Returns the symbolic shape of a tensor or variable.
2D convolution with separable filters.
Sum of the values in a tensor, alongside the specified axis.
layer_locally_connected_1d
Locally-connected layer for 1D inputs.
Permutes axes in a tensor.
Max pooling operation for 3D data (spatial or spatio-temporal).
Returns variables but with zero gradient w.r.t. every other variable.
Element-wise sigmoid.
Model performance metrics
Resizes the volume contained in a 5D tensor.
Update the value of x by subtracting decrement.
Softmax of a tensor.
layer_activation_thresholded_relu
Thresholded Rectified Linear Unit.
Update the value of x by adding increment.
Element-wise square root.
Pads the 2nd and 3rd dimensions of a 4D tensor.
Pads 5D tensor with zeros along the depth, height, width dimensions.
layer_global_average_pooling_2d
Global average pooling operation for spatial data.
Keras array object
Softsign of a tensor.
3D Pooling.
Update the value of x to new_x.
Element-wise tanh.
Upsampling layer for 1D inputs.
Pads the middle dimension of a 3D tensor.
R interface to Keras
Instantiates an all-zeros variable of the same shape as another tensor.
Adagrad optimizer.
Switches between two operations depending on a scalar value.
Sets the learning phase to a fixed value.
Softmax activation function.
Element-wise exponentiation.
Transposed 2D convolution layer (sometimes called Deconvolution).
Upsampling layer for 2D inputs.
Fast GRU implementation backed by CuDNN.
3D convolution layer (e.g. spatial convolution over volumes).
Keras Model
Repeats the input n times.
Apply multiplicative 1-centered Gaussian noise.
Depthwise separable 1D convolution.
Variance of a tensor, alongside the specified axis.
Adam optimizer
Cropping layer for 2D input (e.g. picture).
Layer that adds a list of inputs.
layer_activation_parametric_relu
Parametric Rectified Linear Unit.
Repeats a 2D tensor.
Rectified Linear Unit activation function
Generates a word rank-based probabilistic sampling table.
Returns predictions for a single batch of samples.
Fast LSTM implementation backed by CuDNN.
1D convolution layer (e.g. temporal convolution).
k_sparse_categorical_crossentropy
Categorical crossentropy with integer targets.
layer_global_max_pooling_2d
Global max pooling operation for spatial data.
Layer that concatenates a list of inputs.
layer_activity_regularization
Layer that applies an update to the cost function based on input activity.
Sets the value of a variable, from an R array.
Transposes a tensor and returns it.
Instantiates a variable and returns it.
Applies Alpha Dropout to the input.
Permute the dimensions of an input according to a given pattern
Gated Recurrent Unit - Cho et al.
Layer that computes the maximum (element-wise) of a list of inputs.
layer_global_max_pooling_3d
Global Max pooling operation for 3D data.
Transposed 3D convolution layer (sometimes called Deconvolution).
Convert a list of sequences into a matrix.
RMSProp optimizer
Layer that averages a list of inputs.
Apply additive zero-centered Gaussian noise.
Returns a tensor with truncated random normal distribution of values.
Input layer
Average pooling operation for 3D data (spatial or spatio-temporal).
texts_to_sequences_generator
Transforms each text in texts into a sequence of integers.
Generates probability or class probability predictions for the input samples.
layer_locally_connected_2d
Locally-connected layer for 2D inputs.
Convolutional LSTM.
Max pooling operation for temporal data.
Cropping layer for 3D data (e.g. spatial or spatio-temporal).
Zero-padding layer for 1D input (e.g. temporal sequence).
Flattens an input
Separable 2D convolution.
Layer that multiplies (element-wise) a list of inputs.
layer_global_average_pooling_1d
Global average pooling operation for temporal data.
layer_batch_normalization
Batch normalization layer (Ioffe and Szegedy, 2014).
Serialize a model to an R object
Apply a layer to every temporal slice of an input.
Layer that computes a dot product between samples in two tensors.
Reshapes an output to a certain shape.
Model configuration as JSON
Zero-padding layer for 3D data (spatial or spatio-temporal).
Spatial 2D version of Dropout.
Cropping layer for 1D input (e.g. temporal sequence).
Single gradient update or model evaluation over one batch of samples.
Layer that computes the minimum (element-wise) of a list of inputs.
Applies Dropout to the input.
Upsampling layer for 3D inputs.
Long Short-Term Memory unit - Hochreiter 1997.
Fully-connected RNN where the output is to be fed back to input.
Save/Load models using HDF5 files
Spatial 3D version of Dropout.
Normalize a matrix or nd-array
Convert a list of texts to a matrix.
Spatial 1D version of Dropout.
Stochastic gradient descent optimizer
Model configuration as YAML
Save/Load model weights using HDF5 files
Zero-padding layer for 2D input (e.g. picture).
Adadelta optimizer.
Assign values to names
Reset the states for a layer
Generates skipgram word pairs.
Convert text to a sequence of words (or tokens).
plot.keras_training_history
Plot training history
Converts a text to a sequence of indexes in a fixed-size hashing space.
Generates predictions for the input samples from a data generator.
Adamax optimizer
Utility function for generating batches of temporal data.
Model loss functions
Save a text tokenizer to an external file
Text tokenization utility
predict.keras.engine.training.Model
Generate predictions from a Keras model
Replicates a model on different GPUs.
Nesterov Adam optimizer
Remove the last layer in a model
Objects exported from other packages
Pads sequences to the same length
Pipe operator
Provide a scope with mappings of names to custom objects
L1 and L2 regularization
Transforms each text in texts into a sequence of integers.
Select a Keras implementation and backend
summary.keras.engine.training.Model
Print a summary of a Keras model
One-hot encode a text into a list of word indexes in a vocabulary of size n.
Converts a class vector (integers) to binary class matrix.
Base R6 class for Keras constraints
Create a custom callback