Package deepnetts.net.train.opt
Interface Optimizer
All Known Implementing Classes:
AdaDeltaOptimizer, AdaGradOptimizer, AdamOptimizer, MomentumOptimizer, RmsPropOptimizer, SgdOptimizer
public interface Optimizer
Optimization technique used by the training algorithm to tune the network's weight parameters.
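A minimal sketch of this interface in a weight-update step; the optimizer, gradient, and coordinate variables below are illustrative placeholders, and the exact sign convention of the returned delta is up to the implementation:

    // Ask the optimizer how much to change a weight at the given tensor coordinates,
    // then apply the returned delta; biases follow the same pattern.
    float deltaWeight = optimizer.calculateDeltaWeight(gradient, row, col);
    weights[row][col] += deltaWeight;

    float deltaBias = optimizer.calculateDeltaBias(biasGradient, idx);
    biases[idx] += deltaBias;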
Field Summary
Fields
static final int COL_IDX
static final int DEPTH_IDX
static final float EPS
    Smoothing term to prevent division by zero if the squared gradient sum becomes zero.
static final int FOURTH_DIM_IDX
static final int ROW_IDX
Method Summary
float calculateDeltaBias(float gradient, int idx)
float calculateDeltaWeight(float gradient, int... index)
static Optimizer create(OptimizerType type, AbstractLayer layer)
    Factory method to create different types of optimizers.
void setLearningRate(float learningRate)
Field Details
EPS
static final float EPS
Smoothing term to prevent division by zero if the squared gradient sum becomes zero. Typical values to try are 1e-6 and 1e-8; Keras uses 1e-7 for Adam. See https://d2l.ai/chapter_optimization/adagrad.html
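For context, a sketch of where a smoothing term like EPS sits in an AdaGrad-style update (the variable names here are illustrative, not part of this interface):

    // EPS keeps the denominator nonzero when the accumulated squared gradient is zero
    float adjustedLr = learningRate / (float) Math.sqrt(gradSqrSum + EPS);
    float deltaWeight = -adjustedLr * gradient;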
ROW_IDX
static final int ROW_IDX
COL_IDX
static final int COL_IDX
DEPTH_IDX
static final int DEPTH_IDX
FOURTH_DIM_IDX
static final int FOURTH_DIM_IDX
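These constants appear to name coordinate positions within the varargs index passed to calculateDeltaWeight; that reading is an assumption based on the constant names, not something this page states. A hypothetical sketch:

    // Hypothetical: unpack a 3D tensor coordinate using the index constants
    static float gradientAt(float[][][] grads, int... index) {
        return grads[index[Optimizer.ROW_IDX]][index[Optimizer.COL_IDX]][index[Optimizer.DEPTH_IDX]];
    }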
Method Details
calculateDeltaWeight
float calculateDeltaWeight(float gradient, int... index)
calculateDeltaBias
float calculateDeltaBias(float gradient, int idx)
setLearningRate
void setLearningRate(float learningRate)
create
static Optimizer create(OptimizerType type, AbstractLayer layer)
Factory method to create different types of optimizers.
Parameters:
type - the type of optimizer to create
layer - the layer whose parameters the optimizer will tune
Returns:
an optimizer instance of the given type
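A sketch of the factory in use, assuming OptimizerType.SGD is one of the enum's constants (consistent with the SgdOptimizer implementing class) and layer is an existing AbstractLayer:

    Optimizer optimizer = Optimizer.create(OptimizerType.SGD, layer);
    optimizer.setLearningRate(0.01f);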