
Keras Model composed of a linear stack of layers

Usage

keras_model_sequential(layers = NULL, name = NULL, ...)
Arguments

layers
List of layers to add to the model.

name
Name of the model.

...
Arguments passed on to sequential_model_input_layer:
input_shape
An integer vector of dimensions (not including the batch axis), or a tf$TensorShape instance (also not including the batch axis).
batch_size
Optional input batch size (integer or NULL).
dtype
Optional datatype of the input. When not provided, the Keras default float type will be used.
input_tensor
Optional tensor to use as layer input. If set, the layer will use the tf$TypeSpec of this tensor rather than creating a new placeholder tensor.
sparse
Boolean, whether the placeholder created is meant to be sparse. Defaults to FALSE.
ragged
Boolean, whether the placeholder created is meant to be ragged. In this case, values of NULL in the shape argument represent ragged dimensions. For more information about RaggedTensors, see the TensorFlow guide on ragged tensors. Defaults to FALSE.
type_spec
A tf$TypeSpec object to create the Input from. This tf$TypeSpec represents the entire batch. When provided, all other arguments except name must be NULL.
input_layer_name
Optional name of the input layer (string).
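As a sketch of how these input-layer arguments are forwarded, the call below passes several of them through `...` to sequential_model_input_layer. It assumes the keras package and a TensorFlow backend are installed; the dtype value and the layer name "pixels" are illustrative choices, not defaults.

```r
library(keras)

# Input-layer arguments given to keras_model_sequential() are
# forwarded via `...` to sequential_model_input_layer.
model <- keras_model_sequential(
  input_shape = c(784),         # feature dimension, batch axis excluded
  batch_size = NULL,            # leave the batch size unspecified
  dtype = "float32",            # explicit input dtype
  input_layer_name = "pixels"   # illustrative name for the input layer
) %>%
  layer_dense(units = 10, activation = "softmax")
```

This is equivalent to creating the input layer yourself and then adding subsequent layers; passing the arguments here simply saves the separate step.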
See Also

Other model functions: compile.keras.engine.training.Model(), evaluate.keras.engine.training.Model(), evaluate_generator(), fit.keras.engine.training.Model(), fit_generator(), get_config(), get_layer(), keras_model(), multi_gpu_model(), pop_layer(), predict.keras.engine.training.Model(), predict_generator(), predict_on_batch(), predict_proba(), summary.keras.engine.training.Model(), train_on_batch()
Examples

library(keras)

model <- keras_model_sequential()
model %>%
  layer_dense(units = 32, input_shape = c(784)) %>%
  layer_activation('relu') %>%
  layer_dense(units = 10) %>%
  layer_activation('softmax')

model %>% compile(
  optimizer = 'rmsprop',
  loss = 'categorical_crossentropy',
  metrics = c('accuracy')
)

# alternative way to provide input shape
model <- keras_model_sequential(input_shape = c(784)) %>%
  layer_dense(units = 32) %>%
  layer_activation('relu') %>%
  layer_dense(units = 10) %>%
  layer_activation('softmax')