Uses of Interface
deepnetts.net.train.opt.Optimizer
Packages that use Optimizer

Package                    Description
deepnetts.net.layers       Neural network layers, which are the main building blocks of a neural network.
deepnetts.net.train.opt    Optimization methods used by the training algorithm.
Uses of Optimizer in deepnetts.net.layers

Methods in deepnetts.net.layers that return Optimizer
Uses of Optimizer in deepnetts.net.train.opt
Classes in deepnetts.net.train.opt that implement Optimizer

Modifier and Type    Class                Description
final class          AdaDeltaOptimizer    Implementation of the ADADELTA optimizer, which is a modification of AdaGrad.
final class          AdaGradOptimizer     Implementation of the ADAGRAD optimizer, which uses the sum of squared previous gradients to adjust a global learning rate for each weight.
final class          AdamOptimizer        Implementation of the Adam optimizer, which is a variation of RmsProp that includes a momentum-like factor.
final class          MomentumOptimizer    Momentum optimization adds a momentum parameter to basic Stochastic Gradient Descent, which can accelerate the process.
final class          RmsPropOptimizer     A variation of the AdaDelta optimizer.
final class          SgdOptimizer         Basic Stochastic Gradient Descent optimization algorithm, which iteratively changes weights toward the values that give minimum error.

Methods in deepnetts.net.train.opt that return Optimizer

Modifier and Type    Method                                                        Description
static Optimizer     Optimizer.create(OptimizerType type, AbstractLayer layer)    Factory method to create different types of optimizers.
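To illustrate the pattern this page documents, here is a minimal, self-contained sketch of an optimizer interface with a static factory method that selects a concrete implementation from an enum, in the spirit of Optimizer.create(OptimizerType, AbstractLayer). All names here (SimpleOptimizer, Sgd, Momentum, the Type enum, and the learning-rate parameter) are local stand-ins for illustration, not the actual Deep Netts API, whose create method takes the layer being optimized.

```java
// Illustrative sketch only -- mirrors the factory-method structure of
// Optimizer.create(OptimizerType, AbstractLayer), not the library's code.
interface SimpleOptimizer {

    // Weight change for one parameter, given its current gradient.
    float calculateDelta(float gradient);

    // Stand-in for the OptimizerType enum on this page.
    enum Type { SGD, MOMENTUM }

    // Factory method: picks a concrete optimizer from an enum type,
    // analogous to Optimizer.create(OptimizerType type, AbstractLayer layer).
    static SimpleOptimizer create(Type type, float learningRate) {
        switch (type) {
            case SGD:      return new Sgd(learningRate);
            case MOMENTUM: return new Momentum(learningRate, 0.9f);
            default: throw new IllegalArgumentException("Unknown type: " + type);
        }
    }
}

// Basic Stochastic Gradient Descent: delta = -learningRate * gradient.
final class Sgd implements SimpleOptimizer {
    private final float learningRate;

    Sgd(float learningRate) { this.learningRate = learningRate; }

    @Override
    public float calculateDelta(float gradient) {
        return -learningRate * gradient;
    }
}

// Momentum: delta = momentum * previousDelta - learningRate * gradient,
// so repeated gradients in the same direction accelerate the update.
final class Momentum implements SimpleOptimizer {
    private final float learningRate;
    private final float momentum;
    private float prevDelta = 0f;

    Momentum(float learningRate, float momentum) {
        this.learningRate = learningRate;
        this.momentum = momentum;
    }

    @Override
    public float calculateDelta(float gradient) {
        prevDelta = momentum * prevDelta - learningRate * gradient;
        return prevDelta;
    }
}
```

Keeping the factory on the interface lets training code request an optimizer by type without depending on the concrete classes, which is why the page lists create as the only static method returning Optimizer.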