giagrad.optim

Optimizers

Adadelta

Implements the Adadelta algorithm (an adaptive learning-rate method).
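
Adadelta keeps decaying averages of squared gradients and of squared parameter updates, so the effective step size adapts per parameter. Below is a minimal NumPy sketch of one update; it illustrates the algorithm from Zeiler (2012), not giagrad's implementation, and the name adadelta_step, the state layout, and the defaults rho=0.9, eps=1e-6 are assumptions made for this example:

    import numpy as np

    def adadelta_step(param, grad, state, rho=0.9, eps=1e-6, lr=1.0):
        """One Adadelta update. `state` holds the running averages of
        squared gradients and squared updates, both zero-initialized.
        Illustrative sketch, not giagrad's actual code."""
        sq_grad, sq_delta = state
        # Decaying average of squared gradients.
        sq_grad = rho * sq_grad + (1 - rho) * grad**2
        # Rescale the gradient by the ratio of RMS update to RMS gradient.
        delta = -np.sqrt(sq_delta + eps) / np.sqrt(sq_grad + eps) * grad
        # Decaying average of squared updates.
        sq_delta = rho * sq_delta + (1 - rho) * delta**2
        return param + lr * delta, (sq_grad, sq_delta)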

Adam

Implements the Adam algorithm (adaptive moment estimation).
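
Adam maintains exponential moving averages of the gradient (first moment) and of its square (second moment), with bias correction compensating for their zero initialization. A minimal NumPy sketch of one step follows; this is the textbook algorithm from Kingma & Ba (2015), not giagrad's code, and the name adam_step and the defaults shown are choices made for the example:

    import numpy as np

    def adam_step(param, grad, m, v, t,
                  lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adam update for step t (1-indexed). Illustrative sketch."""
        # Exponential moving averages of the gradient and its square.
        m = beta1 * m + (1 - beta1) * grad
        v = beta2 * v + (1 - beta2) * grad**2
        # Bias correction: m and v start at zero and would otherwise
        # be biased toward zero in early steps.
        m_hat = m / (1 - beta1**t)
        v_hat = v / (1 - beta2**t)
        return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v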

Adamax

Implements the Adamax algorithm (a variant of Adam based on the infinity norm).
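
Adamax replaces Adam's second-moment average with an exponentially weighted infinity norm, a decayed running maximum of gradient magnitudes, so that term needs no bias correction. A sketch under the same caveats as above; adding eps in the denominator is purely a guard against division by zero chosen for this example, and the names and defaults are again illustrative:

    import numpy as np

    def adamax_step(param, grad, m, u, t,
                    lr=2e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adamax update for step t (1-indexed). Illustrative sketch."""
        # First moment, as in Adam.
        m = beta1 * m + (1 - beta1) * grad
        # Infinity-norm accumulator: decayed max of past gradient magnitudes.
        u = np.maximum(beta2 * u, np.abs(grad))
        # Only the first moment needs bias correction.
        return param - (lr / (1 - beta1**t)) * m / (u + eps), m, u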

SGD

Implements stochastic gradient descent (optionally with momentum).
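
With momentum enabled, the optimizer accumulates a velocity that smooths successive gradients. A plain-Python sketch of one step, using one common convention in which the learning rate scales the velocity; sgd_step and its defaults are illustrative names for this example, not giagrad's API:

    def sgd_step(param, grad, velocity, lr=0.01, momentum=0.9):
        """One SGD update with heavy-ball momentum. Illustrative sketch;
        momentum=0.0 reduces it to plain gradient descent."""
        velocity = momentum * velocity + grad
        return param - lr * velocity, velocity

    # Toy usage: minimize f(x) = x**2, whose gradient is 2*x.
    x, v = 5.0, 0.0
    for _ in range(100):
        x, v = sgd_step(x, 2 * x, v)
    print(x)  # small value near the minimum at 0.0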