Base R6 class for Keras constraints
application_inception_resnet_v2
Inception-ResNet v2 model, with weights trained on ImageNet
Base R6 class for Keras layers
Fits the state of the preprocessing layer to the data being passed.
Create a custom Layer
Base R6 class for Keras callbacks
Instantiates the DenseNet architecture.
Base R6 class for Keras wrappers
Inception V3 model, with weights pre-trained on ImageNet.
Activation functions
Instantiates a NASNet model.
Xception V1 model for Keras.
VGG16 and VGG19 models for Keras.
ResNet50 model for Keras.
Callback used to stream events to a server.
callback_reduce_lr_on_plateau
Reduce learning rate when a metric has stopped improving.
Callback that prints metrics to stdout.
Bidirectional wrapper for RNNs.
callback_model_checkpoint
Save the model after every epoch.
Keras backend tensor engine
callback_terminate_on_naan
Callback that terminates training when a NaN loss is encountered.
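The callbacks above are typically combined in a single list passed to fit(). A minimal sketch in R; the file path, monitored metric, and patience values are illustrative, not prescribed:

```r
library(keras)

# Illustrative callback list: stop early when validation loss stalls,
# checkpoint the best model seen so far, and shrink the learning rate
# on plateaus.
callbacks <- list(
  callback_early_stopping(monitor = "val_loss", patience = 5),
  callback_model_checkpoint(filepath = "best_model.h5", save_best_only = TRUE),
  callback_reduce_lr_on_plateau(monitor = "val_loss", factor = 0.5, patience = 2)
)

# Passed to fit() via its `callbacks` argument, e.g.:
# model %>% fit(x, y, validation_split = 0.2, callbacks = callbacks)
```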
Boston housing price regression dataset
TensorBoard basic visualizations
CIFAR10 small image classification
IMDB Movie reviews sentiment classification
MNIST database of handwritten digits
Fits the model on data yielded batch-by-batch by a generator.
fit.keras.engine.training.Model
Train a Keras model
MobileNet model architecture.
Create a custom callback
Create a Keras Layer
callback_learning_rate_scheduler
Learning rate scheduler.
MobileNetV2 model architecture
Create a Keras Wrapper
Fit image data generator internal statistics to some sample data.
Update tokenizer internal vocabulary based on a list of texts or list of sequences.
Generates batches of augmented/normalized data from image data and labels
Retrieves a layer based on either its name (unique) or index.
Callback that streams epoch results to a csv file
flow_images_from_directory
Generates batches of data from images in a directory (with optional augmented/normalized data)
Freeze and unfreeze weights
initializer_glorot_normal
Glorot normal initializer, also called Xavier normal initializer.
initializer_glorot_uniform
Glorot uniform initializer, also called Xavier uniform initializer.
Get the vocabulary for text vectorization layers
Clone a model instance.
compile.keras.engine.training.Model
Configure a Keras model for training
Weight constraints
Count the total number of scalars composing the weights.
Stop training when a monitored quantity has stopped improving.
LeCun normal initializer.
Initializer that generates the identity matrix.
Reuters newswire topics classification
evaluate.keras.engine.training.Model
Evaluate a Keras model
initializer_random_uniform
Initializer that generates tensors with a uniform distribution.
CIFAR100 small image classification
image_dataset_from_directory
Create a dataset from a directory
Initializer that generates tensors initialized to 0.
initializer_variance_scaling
Initializer capable of adapting its scale to the shape of weights.
Generate batches of image data with real-time data augmentation. The data will be looped over (in batches).
Retrieve tensors for layers with multiple nodes
flow_images_from_dataframe
Takes the dataframe and the path to a directory and generates batches of augmented/normalized data.
Downloads a file from a URL if it is not already in the cache.
Fashion-MNIST database of fashion articles
Evaluates the model on a data generator.
initializer_truncated_normal
Initializer that generates a truncated normal distribution.
export_savedmodel.keras.engine.training.Model
Export a Saved Model
imagenet_preprocess_input
Preprocesses a tensor or array encoding a batch of images.
imagenet_decode_predictions
Decodes the prediction of an ImageNet model.
He normal initializer.
Keras implementation
Layer/Model weights as R arrays
Retrieve the next item from a generator
Layer/Model configuration
He uniform variance scaling initializer.
Element-wise value clipping.
Representation of HDF5 dataset to be used instead of an R array
Returns the index of the maximum value along an axis.
Initializer that generates tensors initialized to a constant value.
Returns the index of the minimum value along an axis.
Binary crossentropy between an output tensor and a target tensor.
Destroys the current TF graph and creates a new one.
Sets entries in x to zero at random, while scaling the entire tensor.
Creates a 1D tensor containing a sequence of integers.
Bitwise reduction (logical OR).
Multiplies 2 tensors (and/or variables) and returns a tensor.
k_categorical_crossentropy
Categorical crossentropy between an output tensor and a target tensor.
Adds a bias vector to a tensor.
Active Keras backend
Loads an image into PIL format.
Concatenates a list of tensors alongside the specified axis.
Default float type
Element-wise truth value of (x > y).
Returns the gradients of variables w.r.t. loss.
Flatten a tensor.
Returns a tensor with the same content as the input tensor.
Initializer that generates a random orthogonal matrix.
3D array representation of images
initializer_random_normal
Initializer that generates tensors with a normal distribution.
Default image data format convention ('channels_first' or 'channels_last').
Selects x in test phase, and alt otherwise.
initializer_lecun_uniform
LeCun uniform initializer.
Element-wise absolute value.
k_random_uniform_variable
Instantiates a variable with values drawn from a uniform distribution.
Compute the moving average of a variable.
Element-wise minimum of two tensors.
Returns a tensor with uniform distribution of values.
Initializer that generates tensors initialized to 1.
Applies batch normalization on x given mean, var, beta and gamma.
Computes cos of x element-wise.
Batchwise dot product.
Returns the static number of elements in a Keras variable or tensor.
Returns the dtype of a Keras tensor or variable, as a string.
Install Keras and the TensorFlow backend
Runs CTC loss algorithm on each batch element.
2D convolution.
2D deconvolution (i.e. transposed convolution).
Sets the values of many tensor variables at once.
Check if Keras is Available
Turn a nD tensor into a 2D tensor with same 1st dimension.
Exponential linear unit.
Returns the value of a variable.
Returns the shape of a variable.
Bitwise reduction (logical AND).
Decodes the output of a softmax.
2D convolution with separable filters.
Returns the shape of tensor or variable as a list of int or NULL entries.
Selects x in train phase, and alt otherwise.
3D convolution.
Returns the value of more than one tensor variable.
Apply 2D conv with un-shared weights.
Apply 1D conv with un-shared weights.
Returns the number of axes in a tensor, as an integer.
Evaluates the value of a variable.
Cast an array to the default Keras float type.
Casts a tensor to a different dtype and returns it.
k_normalize_batch_in_training
Computes mean and std for the batch, then applies batch_normalization on the batch.
Returns whether the targets are in the top k predictions.
Instantiates an all-ones tensor variable and returns it.
Returns the learning phase flag.
Normalizes a tensor w.r.t. the L2 norm alongside the specified axis.
Multiplies the values in a tensor, alongside the specified axis.
Instantiates an all-ones variable of the same shape as another tensor.
Cumulative sum of the values in a tensor, alongside the specified axis.
Creates a constant tensor.
1D convolution.
Depthwise 2D convolution with separable filters.
Element-wise equality between two tensors.
Fuzz factor used in numeric expressions.
3D deconvolution (i.e. transposed convolution).
k_ctc_label_dense_to_sparse
Converts CTC labels from dense to sparse.
Sets the learning phase to a fixed value.
Maximum value in a tensor.
Retrieves the elements at indices indices in the tensor reference.
Segment-wise linear approximation of sigmoid.
Instantiates a Keras function
Element-wise truth value of (x >= y).
Repeats the elements of a tensor along an axis.
Element-wise exponentiation.
Reset graph identifiers.
Prints message and the tensor value when evaluated.
Element-wise maximum of two tensors.
Element-wise exponential.
Element-wise log.
Computes log(sum(exp(elements across dimensions of a tensor))).
Reduce elems using fn to combine them from right to left.
Reduce elems using fn to combine them from left to right.
Resizes the images contained in a 4D tensor.
Reshapes a tensor to the specified shape.
Adds a 1-sized dimension at index axis.
TF session to be used by the backend.
Instantiates an identity matrix and returns it.
Cumulative product of the values in a tensor, alongside the specified axis.
k_manual_variable_initialization
Sets the manual variable initialization flag.
Permutes axes in a tensor.
Map the function fn over the elements elems and return the outputs.
Instantiates a placeholder tensor and returns it.
Returns whether x is a placeholder.
Returns whether x is a Keras tensor.
Returns a tensor with random binomial distribution of values.
Converts a sparse tensor into a dense tensor and returns it.
Softplus of a tensor.
Element-wise rounding to the closest integer.
Element-wise sigmoid.
Iterates over the time dimension of a tensor
Element-wise sign.
layer_activation_parametric_relu
Parametric Rectified Linear Unit.
Rectified Linear Unit activation function
Transposes a tensor and returns it.
Instantiates a variable with values drawn from a normal distribution.
Sets the value of a variable, from an R array.
Returns a tensor with normal distribution of values.
Creates attention layer
Layer that averages a list of inputs.
layer_batch_normalization
Batch normalization layer (Ioffe and Szegedy, 2014).
Average pooling operation for 3D data (spatial or spatio-temporal).
Element-wise truth value of (x < y).
Mean of a tensor, alongside the specified axis.
Element-wise truth value of (x <= y).
Minimum value in a tensor.
Element-wise inequality between two tensors.
Computes the one-hot representation of an integer tensor.
Resizes the volume contained in a 5D tensor.
Stacks a list of rank R tensors into a rank R+1 tensor.
Softsign of a tensor.
Standard deviation of a tensor, alongside the specified axis.
Exponential Linear Unit.
Reverse a tensor along the specified axes.
layer_activation_leaky_relu
Leaky version of a Rectified Linear Unit.
Layer that adds a list of inputs.
Returns whether a tensor is a sparse tensor.
Get the uid for the default graph.
Returns whether x is a symbolic tensor.
2D Pooling.
Rectified linear unit.
Repeats a 2D tensor.
Returns the symbolic shape of a tensor or variable.
3D Pooling.
Switches between two operations depending on a scalar value.
Element-wise tanh.
Variance of a tensor, alongside the specified axis.
Pads the 2nd and 3rd dimensions of a 4D tensor.
k_sparse_categorical_crossentropy
Categorical crossentropy with integer targets.
Turns positive integers (indexes) into dense vectors of fixed size.
Max pooling operation for temporal data.
Flattens an input
1D convolution layer (e.g. temporal convolution).
Applies Alpha Dropout to the input.
Layer that concatenates a list of inputs.
Cropping layer for 1D input (e.g. temporal sequence).
Cropping layer for 2D input (e.g. picture).
Max pooling operation for spatial data.
Depthwise separable 1D convolution.
Computes sin of x element-wise.
Separable 2D convolution.
Fully-connected RNN where the output is to be fed back to input.
Pads the middle dimension of a 3D tensor.
Instantiates a variable and returns it.
Transposed 3D convolution layer (sometimes called Deconvolution).
Creates a tensor by tiling x by n.
Element-wise square.
Removes a 1-dimension from the tensor at index axis.
Softmax of a tensor.
Pads 5D tensor with zeros along the depth, height, width dimensions.
Layer that computes a dot product between samples in two tensors.
Convolutional LSTM.
Applies Dropout to the input.
Instantiates an all-zeros variable and returns it.
Update the value of x by adding increment.
Load a Keras model from the Saved Model format
Adamax optimizer
Spatial 1D version of Dropout.
Model configuration as JSON
Returns variables but with zero gradient w.r.t. every other variable.
Returns a tensor with truncated random normal distribution of values.
R interface to Keras
Element-wise square root.
Sum of the values in a tensor, alongside the specified axis.
Update the value of x to new_x.
layer_global_average_pooling_1d
Global average pooling operation for temporal data.
layer_global_average_pooling_2d
Global average pooling operation for spatial data.
Long Short-Term Memory unit - Hochreiter 1997.
Masks a sequence by using a mask value to skip timesteps.
Update the value of x by subtracting decrement.
Average pooling for temporal data.
Transposed 1D convolution layer (sometimes called Deconvolution).
Average pooling operation for spatial data.
Keras array object
2D convolution layer (e.g. spatial convolution over images).
Keras Model
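In contrast to the sequential API, keras_model() builds a model from explicitly connected input and output tensors (the functional API). A sketch with illustrative shapes:

```r
library(keras)

# Functional API: connect an input tensor to an output tensor
inputs <- layer_input(shape = c(784))

outputs <- inputs %>%
  layer_dense(units = 64, activation = "relu") %>%
  layer_dense(units = 10, activation = "softmax")

model <- keras_model(inputs = inputs, outputs = outputs)
```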
Create a Keras custom model
layer_activity_regularization
Layer that applies an update to the cost function based on input activity.
layer_activation_thresholded_relu
Thresholded Rectified Linear Unit.
Transposed 2D convolution layer (sometimes called Deconvolution).
layer_global_max_pooling_2d
Global max pooling operation for spatial data.
Keras Model composed of a linear stack of layers
Apply an activation function to an output.
Instantiates an all-zeros variable of the same shape as another tensor.
Scaled Exponential Linear Unit.
Softmax activation function.
Constructs a DenseFeatures layer.
3D convolution layer (e.g. spatial convolution over volumes).
Apply multiplicative 1-centered Gaussian noise.
Apply additive zero-centered Gaussian noise.
Depthwise separable 2D convolution.
Cropping layer for 3D data (e.g. spatial or spatio-temporal).
Fast GRU implementation backed by CuDNN.
layer_global_max_pooling_1d
Global max pooling operation for temporal data.
layer_global_average_pooling_3d
Global Average pooling operation for 3D data.
Max pooling operation for 3D data (spatial or spatio-temporal).
layer_global_max_pooling_3d
Global Max pooling operation for 3D data.
Nesterov Adam optimizer
Text tokenization utility
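A minimal tokenizer workflow, assuming a tiny in-line corpus (the texts and vocabulary size are illustrative):

```r
library(keras)

# Hypothetical corpus
texts <- c("the cat sat on the mat", "the dog ate my homework")

# Build the vocabulary from the corpus
tokenizer <- text_tokenizer(num_words = 100) %>%
  fit_text_tokenizer(texts)

# Each text becomes a sequence of word indexes
texts_to_sequences(tokenizer, texts)
```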
Add a densely-connected NN layer to an output
Fast LSTM implementation backed by CuDNN.
Gated Recurrent Unit - Cho et al.
Upsampling layer for 3D inputs.
Zero-padding layer for 2D input (e.g. picture).
Layer that computes the maximum (element-wise) of a list of inputs.
Zero-padding layer for 1D input (e.g. temporal sequence).
layer_layer_normalization
Layer normalization layer (Ba et al., 2016).
Wraps arbitrary expression as a layer
Input layer
Layer that multiplies (element-wise) a list of inputs.
Generates a word rank-based probabilistic sampling table.
Permute the dimensions of an input according to a given pattern
Upsampling layer for 1D inputs.
Convert a list of texts to a matrix.
Spatial 3D version of Dropout.
Model loss functions
layer_multi_head_attention
MultiHeadAttention layer
layer_locally_connected_2d
Locally-connected layer for 2D inputs.
Layer that computes the minimum (element-wise) of a list of inputs.
Computes the binary crossentropy loss.
layer_locally_connected_1d
Locally-connected layer for 1D inputs.
Spatial 2D version of Dropout.
Upsampling layer for 2D inputs.
Adagrad optimizer.
Assign values to names
Normalize a matrix or nd-array
Zero-padding layer for 3D data (spatial or spatio-temporal).
Adadelta optimizer.
Reshapes an output to a certain shape.
Text vectorization layer
Export to Saved Model format
Repeats the input n times.
Layer that subtracts two inputs.
Objects exported from other packages
Adam optimizer
Replicates a model on different GPUs.
Model performance metrics
plot.keras_training_history
Plot training history
Remove the last layer in a model
Pads sequences to the same length
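By default, pad_sequences() pads at the start ("pre") with zeros so that all sequences share a common length. A small sketch with illustrative input:

```r
library(keras)

seqs <- list(c(1, 2, 3), c(4, 5))

# Pads with zeros at the start to length 4
pad_sequences(seqs, maxlen = 4)
#      [,1] [,2] [,3] [,4]
# [1,]    0    1    2    3
# [2,]    0    0    4    5
```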
Pipe operator
Stochastic gradient descent optimizer
Model configuration as YAML
Returns predictions for a single batch of samples.
RMSProp optimizer
Generates probability or class probability predictions for the input samples.
L1 and L2 regularization
Save model weights in the SavedModel format
predict.keras.engine.training.Model
Generate predictions from a Keras model
Save/Load model weights using HDF5 files
Save/Load models using SavedModel format
Save a text tokenizer to an external file
Transforms each text in texts to a sequence of integers.
texts_to_sequences_generator
Transforms each text in texts to a sequence of integers.
Utility function for generating batches of temporal data.
Single gradient update or model evaluation over one batch of samples.
Apply a layer to every temporal slice of an input.
Converts a class vector (integers) to binary class matrix.
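A one-hot encoding sketch for to_categorical(); the labels are illustrative:

```r
library(keras)

labels <- c(0, 2, 1, 2)

# One row per label, one column per class (classes are 0-based)
to_categorical(labels, num_classes = 3)
```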
Sets vocabulary (and optionally document frequency) data for the layer
Generates skipgram word pairs.
Convert text to a sequence of words (or tokens).
Select a Keras implementation and backend
One-hot encode a text into a list of word indexes in a vocabulary of size n.
Generates predictions for the input samples from a data generator.
Convert a list of sequences into a matrix.
Save/Load models using HDF5 files
Reset the states for a layer
summary.keras.engine.training.Model
Print a summary of a Keras model
Provide a scope with mappings of names to custom objects
Converts a text to a sequence of indexes in a fixed-size hashing space.
Serialize a model to an R object