Package deepnetts.net.train.opt
package deepnetts.net.train.opt
Optimization methods used by the training algorithm.
Class Summary:

- Skeletal implementation of the Optimizer interface that minimizes the effort needed to implement specific optimizers.
- Implementation of the ADADELTA Optimizer, which is a modification of AdaGrad.
- Implementation of the ADAGRAD Optimizer, which uses the sum of squared previous gradients to adjust a global learning rate for each weight.
- Implementation of the Adam optimizer, a variation of RmsProp that includes a momentum-like factor.
- Learning rate decay (see https://www.coursera.org/learn/deep-neural-network/lecture/hjgIA/learning-rate-decay).
- Momentum optimization, which adds a momentum parameter to basic Stochastic Gradient Descent and can accelerate training.
- Optimizer: the optimization technique used by the training algorithm to tune the network's weight parameters.
- Supported types of optimization methods used by the back-propagation training algorithm.
- RmsProp: a variation of the AdaDelta optimizer.
- Basic Stochastic Gradient Descent optimization algorithm, which iteratively changes the weights toward the values that give minimum error.

The per-weight update rules behind several of these optimizers are sketched below.
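The classes in this package are variants of the same gradient-based weight update. As a rough orientation only (a minimal standalone sketch, not the Deep Netts implementation; all class, method, and variable names below are illustrative), the per-weight rules for plain SGD, momentum, and AdaGrad can be written as:

// Illustrative per-weight update rules; not Deep Netts source code.
public class OptimizerUpdateSketch {

    // SGD: move the weight against the gradient, scaled by the learning rate.
    static float sgd(float weight, float gradient, float learningRate) {
        return weight - learningRate * gradient;
    }

    // Momentum: keep a running "velocity" that accumulates past gradients,
    // which can accelerate descent along consistent directions.
    static float momentum(float weight, float gradient, float learningRate,
                          float momentumFactor, float[] velocity, int i) {
        velocity[i] = momentumFactor * velocity[i] - learningRate * gradient;
        return weight + velocity[i];
    }

    // AdaGrad: divide the global learning rate by the square root of the sum
    // of squared past gradients, giving each weight its own effective step size.
    static float adaGrad(float weight, float gradient, float learningRate,
                         float[] squaredGradSum, int i) {
        squaredGradSum[i] += gradient * gradient;
        return weight - learningRate * gradient
                / (float) (Math.sqrt(squaredGradSum[i]) + 1e-8);
    }

    public static void main(String[] args) {
        float[] velocity = new float[1];
        float[] squaredGradSum = new float[1];
        float w = 0.5f, grad = 0.2f;
        System.out.println("SGD:      " + sgd(w, grad, 0.01f));
        System.out.println("Momentum: " + momentum(w, grad, 0.01f, 0.9f, velocity, 0));
        System.out.println("AdaGrad:  " + adaGrad(w, grad, 0.01f, squaredGradSum, 0));
    }
}

Adam and RmsProp combine these ideas: a running average of squared gradients (as in AdaGrad/AdaDelta) plus a momentum-like term, as noted in the descriptions above.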