torchopt (version 0.1.4)

Advanced Optimizers for Torch

Description

Optimizers for the 'torch' deep learning library. These functions implement recent results published in the literature and are not part of the optimizers offered in 'torch'. Prospective users should test these optimizers with their data, since performance depends on the specific problem being solved. The package includes the following optimizers: (a) 'adabelief' by Zhuang et al. (2020); (b) 'adabound' by Luo et al. (2019); (c) 'adahessian' by Yao et al. (2021); (d) 'adamw' by Loshchilov & Hutter (2019); (e) 'madgrad' by Defazio and Jelassi (2021); (f) 'nadam' by Dozat (2019); (g) 'qhadam' by Ma and Yarats (2019); (h) 'radam' by Liu et al. (2019); (i) 'swats' by Keskar and Socher (2018); (j) 'yogi' by Zaheer et al. (2019).
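These optimizers follow the same interface as the optimizers shipped with 'torch': they are created from a list of parameters and stepped inside the usual zero_grad()/backward()/step() loop. The following is a minimal sketch of that workflow, assuming 'torch' and 'torchopt' are installed; the learning rate and number of steps are illustrative only.

library(torch)
library(torchopt)

# a single parameter to optimize, starting at 1
x <- torch_tensor(1, requires_grad = TRUE)

# create the optimizer as you would create torch::optim_adam()
opt <- optim_adamw(params = list(x), lr = 0.1)

for (i in 1:200) {
  opt$zero_grad()
  loss <- (x - 3)^2   # simple convex objective with minimum at x = 3
  loss$backward()
  opt$step()
}

as.numeric(x)         # should be close to 3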

Install

install.packages('torchopt')

Monthly Downloads

148

Version

0.1.4

License

Apache License (>= 2)

Maintainer

Gilberto Camara

Last Published

June 6th, 2023

Functions in torchopt (0.1.4)

test_optim: Test optimization function
torchopt-package: torchopt: Advanced Optimizers for Torch
state<-: Imported function
state: Imported function
optim_adabound: Adabound optimizer
optim_swats: SWATS optimizer
optim_madgrad: MADGRAD optimizer
optim_qhadam: QHAdam optimization algorithm
optim_adahessian: Adahessian optimizer
optim_yogi: Yogi optimizer
optim_radam: RAdam optimizer
optim_adabelief: Adabelief optimizer
optim_adamw: AdamW optimizer
optim_nadam: Nadam optimizer
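
The test_optim() helper can be used to compare how these optimizers traverse standard two-dimensional test surfaces. A hedged sketch follows; the argument names (optim, opt_hparams, test_fn, steps) and their values are assumptions based on the function's documented purpose, so consult ?test_optim for the exact interface.

library(torchopt)

set.seed(42)

# trace optim_adabelief over the Beale test function
# (learning rate and step count are illustrative)
test_optim(
  optim       = optim_adabelief,
  opt_hparams = list(lr = 0.5),
  test_fn     = "beale",
  steps       = 400
)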