vaeac_get_optimizer: Create the optimizer used to train the vaeac model

Description:

     Only torch::optim_adam() is currently supported, but additional optimizers can easily be added later.
Usage:

     vaeac_get_optimizer(vaeac_model, lr, optimizer_name = "adam")
Arguments:

     vaeac_model: A vaeac model created using vaeac().

     lr: Positive numeric (default is 0.001). The learning rate used in the torch::optim_adam() optimizer.

     optimizer_name: String containing the name of the torch::optimizer() to use.

Value:

     A torch::optim_adam() optimizer connected to the parameters of the vaeac_model.
Author(s):

     Lars Henry Berge Olsen
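
The following is a minimal sketch of how such an optimizer factory can be implemented with the torch R package, assuming the vaeac model is a torch::nn_module() that exposes its parameters via $parameters. The name vaeac_get_optimizer_sketch is hypothetical; this is an illustration, not the package's actual source code.

     # Sketch only (assumption): map optimizer_name to a torch optimizer
     # constructor and attach it to the model's parameters.
     vaeac_get_optimizer_sketch <- function(vaeac_model, lr, optimizer_name = "adam") {
       if (optimizer_name == "adam") {
         # torch::optim_adam() returns an optimizer bound to the supplied parameters.
         torch::optim_adam(vaeac_model$parameters, lr = lr)
       } else {
         stop("Only the 'adam' optimizer is currently supported.")
       }
     }

     # Hypothetical usage, assuming `model` was created with vaeac():
     # optimizer <- vaeac_get_optimizer_sketch(model, lr = 0.001)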