
optimg (version 0.1.2)

General-Purpose Gradient-Based Optimization

Description

Provides general-purpose tools to help users implement steepest gradient descent methods for function optimization; for details, see Ruder (2016). The methods currently implemented are the Steepest 2-Groups Gradient Descent and the Adaptive Moment Estimation (Adam); other methods will be added in the future.
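
To illustrate the kind of update rule the package implements, here is a minimal base-R sketch of the Adam update with its standard defaults. This is a generic illustration only, not optimg's actual implementation:

```r
# Generic Adam sketch (standard defaults); NOT optimg's internal code.
adam <- function(par, fn, gr, lr = 0.1, beta1 = 0.9, beta2 = 0.999,
                 eps = 1e-8, maxit = 500) {
  m <- v <- numeric(length(par))
  for (t in seq_len(maxit)) {
    g <- gr(par)
    m <- beta1 * m + (1 - beta1) * g       # decayed mean of gradients
    v <- beta2 * v + (1 - beta2) * g^2     # decayed mean of squared gradients
    m_hat <- m / (1 - beta1^t)             # bias-corrected first moment
    v_hat <- v / (1 - beta2^t)             # bias-corrected second moment
    par <- par - lr * m_hat / (sqrt(v_hat) + eps)
  }
  list(par = par, value = fn(par))
}

fn <- function(x) sum(x^2)   # convex test function, minimum at c(0, 0)
gr <- function(x) 2 * x
adam(c(3, -2), fn, gr)$par   # approaches c(0, 0)
```

The bias-correction terms matter early on, when `m` and `v` are still dominated by their zero initialization; without them the first steps would be far too small.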


Install

install.packages('optimg')


Version

0.1.2

License

GPL-3


Maintainer

Vithor Franco

Last Published

October 7th, 2021

Functions in optimg (0.1.2)

optimg

General-Purpose Gradient-Based Optimization
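
A hypothetical call is sketched below; the argument names `par`, `fn`, and `method` are assumed to mirror `stats::optim()`, and `method = "ADAM"` follows the description above. Check `?optimg` for the actual signature:

```r
# Hypothetical usage sketch; argument names are assumed, not confirmed.
fn <- function(par) sum((par - c(1, 2))^2)  # convex; minimum at c(1, 2)

# Run only if the package is available (install.packages("optimg")).
if (requireNamespace("optimg", quietly = TRUE)) {
  res <- optimg::optimg(par = c(0, 0), fn = fn, method = "ADAM")
  print(res)
}
```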