Package deepnetts.net.train.opt
Enum Class OptimizerType
- All Implemented Interfaces:
Serializable, Comparable<OptimizerType>, Constable
Supported types of optimization methods used by the back-propagation training algorithm.
For more info see: http://ruder.io/optimizing-gradient-descent/index.html
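As a minimal sketch of how this enum might be used, the example below selects an optimizer type from a configuration string. The `OptimizerType` enum here is a local stand-in that mirrors the constants documented on this page; the `configured` value and the surrounding class are hypothetical, not part of the Deep Netts API.

```java
// Local stand-in enum mirroring the constants documented on this page.
enum OptimizerType { SGD, MOMENTUM, ADAGRAD, RMSPROP, ADADELTA, ADAM }

public class OptimizerSelection {
    public static void main(String[] args) {
        // Hypothetical optimizer name read from configuration.
        String configured = "ADAM";

        // valueOf maps the exact constant name to the enum instance.
        OptimizerType type = OptimizerType.valueOf(configured);

        System.out.println("Selected optimizer: " + type.name());
    }
}
```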
Nested Class Summary
Nested classes/interfaces inherited from class java.lang.Enum
Enum.EnumDesc<E extends Enum<E>>
Enum Constant Summary
Enum Constants: SGD, MOMENTUM, ADAGRAD, RMSPROP, ADADELTA, ADAM
Method Summary
static OptimizerType valueOf(String name)
Returns the enum constant of this class with the specified name.
static OptimizerType[] values()
Returns an array containing the constants of this enum class, in the order they are declared.
Methods inherited from class java.lang.Enum:
compareTo, describeConstable, equals, getDeclaringClass, hashCode, name, ordinal, toString, valueOf
Enum Constant Details
- SGD
Stochastic Gradient Descent, a basic type of neural network optimization algorithm.
- MOMENTUM
- ADAGRAD
- RMSPROP
- ADADELTA
- ADAM
Method Details
- values
static OptimizerType[] values()
Returns an array containing the constants of this enum class, in the order they are declared.
- Returns:
an array containing the constants of this enum class, in the order they are declared
- valueOf
static OptimizerType valueOf(String name)
Returns the enum constant of this class with the specified name. The string must match exactly an identifier used to declare an enum constant in this class. (Extraneous whitespace characters are not permitted.)
- Parameters:
name - the name of the enum constant to be returned.
- Returns:
the enum constant with the specified name
- Throws:
IllegalArgumentException - if this enum class has no constant with the specified name
NullPointerException - if the argument is null
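The `values()` and `valueOf(String)` semantics described above can be demonstrated with a short sketch. The `OptimizerType` enum below is a local stand-in mirroring the constants documented on this page, so the standard `Enum` behavior shown applies identically to the real class.

```java
// Local stand-in enum mirroring the constants documented on this page.
enum OptimizerType { SGD, MOMENTUM, ADAGRAD, RMSPROP, ADADELTA, ADAM }

public class OptimizerTypeDemo {
    public static void main(String[] args) {
        // values() returns the constants in declaration order.
        for (OptimizerType t : OptimizerType.values()) {
            System.out.println(t.ordinal() + ": " + t.name());
        }

        // valueOf requires an exact match: a name in the wrong case (or with
        // extraneous whitespace) raises IllegalArgumentException.
        try {
            OptimizerType.valueOf("adam");
        } catch (IllegalArgumentException e) {
            System.out.println("No constant named \"adam\"");
        }
    }
}
```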