RBERT (version 0.1.11)

create_optimizer: Create an optimizer training op

Description

create_optimizer doesn't actually return an optimizer object; it returns the training operation resulting from a tf$group() call.

Usage

create_optimizer(loss, init_lr, num_train_steps, num_warmup_steps, use_tpu)

Arguments

loss

Float Tensor; the loss for this step (calculated elsewhere; in principle it is a function of the trainable parameter values).

init_lr

Numeric; initial learning rate.

num_train_steps

Integer; number of steps to train for.

num_warmup_steps

Integer; number of steps to use for "warm-up" (see the sketch below).

use_tpu

Logical; whether to use TPU.
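
As a rough illustration of how these arguments relate, warm-up is usually a small fraction of the total training steps (the BERT reference setup uses about 10%). The step counts, learning rate, and the loss tensor below are placeholder assumptions, not RBERT defaults:

num_train_steps <- 10000L
# Warm up for ~10% of the training steps (illustrative choice).
num_warmup_steps <- as.integer(0.1 * num_train_steps)

train_op <- create_optimizer(
  loss = loss,  # a Float Tensor defined elsewhere in the graph
  init_lr = 2e-5,  # a learning rate typical for BERT fine-tuning
  num_train_steps = num_train_steps,
  num_warmup_steps = num_warmup_steps,
  use_tpu = FALSE
)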

Value

A training op: the result of a tf$group() of TensorFlow operations.

Details

See also:

https://www.tensorflow.org/api_docs/python/tf/group

https://stackoverflow.com/questions/41780655/what-is-the-difference-between-tf-group-and-tf-control-dependencies

The tf$gradients() routine is called in the course of this function: https://www.tensorflow.org/api_docs/python/tf/gradients
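
As a minimal sketch of what tf$group() does (illustrative TensorFlow 1.x code, not part of RBERT): it bundles several ops into a single op with no output value, so running the grouped op runs all of its members.

a <- tensorflow::tf$Variable(0L)
b <- tensorflow::tf$Variable(0L)
# Running step executes both assignments; tf$group() returns an op, not a tensor.
step <- tensorflow::tf$group(
  tensorflow::tf$assign_add(a, 1L),
  tensorflow::tf$assign_add(b, 1L)
)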

Examples

# NOT RUN {
# Build a toy graph (TensorFlow 1.x API): a trainable variable and a
# simple tensor to serve as the loss.
with(tensorflow::tf$variable_scope("examples",
  reuse = tensorflow::tf$AUTO_REUSE
), {
  # A 10 x 20 trainable variable standing in for model weights.
  totrain <- tensorflow::tf$get_variable(
    "totrain",
    tensorflow::shape(10L, 20L)
  )
  # A trivial loss that depends on the trainable variable.
  loss <- 2 * totrain

  # Build the training op: a tf$group() of the update operations.
  train_op <- create_optimizer(
    loss = loss,
    init_lr = 0.01,
    num_train_steps = 20L,
    num_warmup_steps = 10L,
    use_tpu = FALSE
  )
})
# }
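
To actually run the training op, evaluate it in a TensorFlow 1.x session. A minimal sketch, assuming the train_op built above:

sess <- tensorflow::tf$Session()
sess$run(tensorflow::tf$global_variables_initializer())
sess$run(train_op)  # one optimization step
sess$close()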
