Package deepnetts.net.train.opt
Class AdaGradOptimizer
java.lang.Object
deepnetts.net.train.opt.AdaGradOptimizer
- All Implemented Interfaces:
Optimizer, Serializable
Implementation of the AdaGrad Optimizer, which uses the sum of squared previous gradients to adjust the global learning rate for each weight.
- Author:
- Zoran Sevarac
- See Also:
-
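The update rule described above can be sketched as follows. This is an illustrative implementation of the AdaGrad technique, not the Deep Netts source: the class name, the per-weight accumulator field, the epsilon term, and the sign convention of the returned delta are all assumptions for the sketch.

```java
// Illustrative AdaGrad sketch (not the Deep Netts implementation).
// Each weight keeps a running sum of its squared gradients; the global
// learning rate is divided by the square root of that sum, so frequently
// updated weights get progressively smaller steps.
public class AdaGradSketch {
    private final float[] sqGradSum;      // per-weight accumulator of g^2 (assumed field)
    private float learningRate = 0.01f;   // assumed default
    private static final float EPS = 1e-8f; // assumed stabilizer to avoid division by zero

    public AdaGradSketch(int numWeights) {
        sqGradSum = new float[numWeights];
    }

    public void setLearningRate(float learningRate) {
        this.learningRate = learningRate;
    }

    /** Delta for the weight at index idx given its current gradient. */
    public float calculateDeltaWeight(float grad, int idx) {
        sqGradSum[idx] += grad * grad;                        // accumulate G_i += g^2
        return -learningRate * grad
                / (float) (Math.sqrt(sqGradSum[idx]) + EPS);  // -lr * g / sqrt(G_i)
    }

    /** Biases follow the same rule; a separate accumulator would be kept for them. */
    public float calculateDeltaBias(float grad, int idx) {
        return calculateDeltaWeight(grad, idx);
    }
}
```

With a constant gradient of 1.0, the first call returns roughly -0.01 and each later call returns a smaller delta, since the accumulated sum in the denominator only grows.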
Field Summary
-
Constructor Summary
Constructors
-
Method Summary
Modifier and Type    Method    Description
float    calculateDeltaBias(float grad, int idx)
float    calculateDeltaWeight(float grad, int... idxs)
void    setLearningRate(float learningRate)
-
Constructor Details
-
AdaGradOptimizer
-
-
Method Details
-
calculateDeltaWeight
public float calculateDeltaWeight(float grad, int... idxs)
- Specified by:
calculateDeltaWeight in interface Optimizer
-
calculateDeltaBias
public float calculateDeltaBias(float grad, int idx)
- Specified by:
calculateDeltaBias in interface Optimizer
-
setLearningRate
public void setLearningRate(float learningRate)
- Specified by:
setLearningRate in interface Optimizer
-