Package deepnetts.net.layers.activation
Class LeakyRelu
java.lang.Object
    deepnetts.net.layers.activation.LeakyRelu

All Implemented Interfaces:
    ActivationFunction, Serializable, Consumer<TensorBase>
Leaky Rectified Linear Activation and its Derivative.

    y  = x        for x > 0
         0.01 * x for x <= 0

    y' = 1        for x > 0
         0.01     for x <= 0

Allows a small, positive gradient when the unit is not active.

Reference: https://ai.stanford.edu/~amaas/papers/relu_hybrid_icml2013_final.pdf
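The piecewise definition above is simple enough to verify in a few lines. The following standalone sketch mirrors it; it is not the library source, and the 0.01 slope and class/method names here are illustrative only (the configurable slope corresponds to the constructor parameter a below):

    // Standalone sketch of the leaky ReLU math above; `a` is the small
    // negative-side slope (0.01 here, assumed; the default used by the
    // no-arg LeakyRelu() constructor is not documented on this page).
    public final class LeakyReluSketch {

        private final float a;

        public LeakyReluSketch(float a) {
            this.a = a;
        }

        // y = x for x > 0, a * x for x <= 0
        float value(float x) {
            return x > 0 ? x : a * x;
        }

        // y' = 1 for x > 0, a for x <= 0
        float prime(float x) {
            return x > 0 ? 1.0f : a;
        }

        public static void main(String[] args) {
            LeakyReluSketch f = new LeakyReluSketch(0.01f);
            System.out.println(f.value(2.5f));   // 2.5
            System.out.println(f.value(-2.5f));  // -0.025
            System.out.println(f.prime(-2.5f));  // 0.01
        }
    }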
Author:
    Zoran Sevarac
Constructor Summary

Constructors:
    LeakyRelu()
    LeakyRelu(float a)
Method Summary
Modifier and TypeMethodDescriptionvoid
void
apply
(TensorBase tensor, int from, int to) float
getPrime
(float y) Returns the first derivative of activation function for specified output yfloat
getValue
(float x) Returns the value of activation function for specified input xMethods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface deepnetts.net.layers.activation.ActivationFunction
accept
Constructor Details

LeakyRelu
public LeakyRelu()

LeakyRelu
public LeakyRelu(float a)
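A minimal usage sketch of the two constructors. The 0.05f slope is an arbitrary example value, and the no-arg constructor is presumed to fall back to the library's built-in slope:

    import deepnetts.net.layers.activation.ActivationFunction;
    import deepnetts.net.layers.activation.LeakyRelu;

    public class LeakyReluConstruction {
        public static void main(String[] args) {
            // Built-in negative-side slope
            ActivationFunction standard = new LeakyRelu();

            // Explicit slope a = 0.05 (value chosen arbitrarily for the example)
            ActivationFunction custom = new LeakyRelu(0.05f);

            System.out.println(standard.getValue(-1.0f));
            System.out.println(custom.getValue(-1.0f)); // -0.05
        }
    }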
Method Details

getValue
public float getValue(float x)

Description copied from interface: ActivationFunction
Returns the value of activation function for specified input x

Specified by:
    getValue in interface ActivationFunction
Parameters:
    x - input for activation
Returns:
    value of activation function
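For example, a short sketch of the forward evaluation (output values follow the formula above for a slope of 0.01):

    import deepnetts.net.layers.activation.LeakyRelu;

    public class GetValueExample {
        public static void main(String[] args) {
            LeakyRelu relu = new LeakyRelu(0.01f);
            System.out.println(relu.getValue(3.0f));   // 3.0: identity for x > 0
            System.out.println(relu.getValue(-3.0f));  // -0.03: slope-scaled for x <= 0
        }
    }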
getPrime
public float getPrime(float y)

Description copied from interface: ActivationFunction
Returns the first derivative of activation function for specified output y

Specified by:
    getPrime in interface ActivationFunction
Parameters:
    y - output of activation function
Returns:
    first derivative of activation function
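Note that the argument is the function's output y, not its input. For leaky ReLU the output has the same sign as the input, so the derivative can be looked up from the output alone; during backpropagation the layer output is already available, avoiding a second forward evaluation. A sketch:

    import deepnetts.net.layers.activation.LeakyRelu;

    public class GetPrimeExample {
        public static void main(String[] args) {
            LeakyRelu relu = new LeakyRelu(0.01f);
            float y = relu.getValue(-2.0f);    // forward output: -0.02
            float dy = relu.getPrime(y);       // derivative looked up from the output
            System.out.println(dy);            // 0.01, since y <= 0
        }
    }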
apply
public void apply(TensorBase tensor)

Specified by:
    apply in interface ActivationFunction

apply
public void apply(TensorBase tensor, int from, int to)

Specified by:
    apply in interface ActivationFunction
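TensorBase's API is not shown on this page, so the following sketch only illustrates the presumable effect of the ranged overload using a plain float[]: transform each element in [from, to) in place. The applyRange method and its slope parameter are hypothetical names for illustration, not library API:

    public class ApplyRangeSketch {

        // Hypothetical stand-in for apply(TensorBase tensor, int from, int to):
        // applies leaky ReLU in place to values[from] .. values[to - 1].
        static void applyRange(float[] values, int from, int to, float a) {
            for (int i = from; i < to; i++) {
                float x = values[i];
                values[i] = x > 0 ? x : a * x;
            }
        }

        public static void main(String[] args) {
            float[] v = {-2.0f, -1.0f, 0.0f, 1.0f, 2.0f};
            applyRange(v, 1, 4, 0.01f);  // only indices 1..3 are transformed
            System.out.println(java.util.Arrays.toString(v));
            // [-2.0, -0.01, 0.0, 1.0, 2.0]
        }
    }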