Package deepnetts.net.layers.activation
Class Relu
java.lang.Object
    deepnetts.net.layers.activation.Relu
All Implemented Interfaces:
    ActivationFunction, Serializable, Consumer<TensorBase>
Rectified Linear Activation and its Derivative.

y = max(0, x)

     | 1, x > 0
y' = |
     | 0, x <= 0
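The formulas above translate directly into plain Java. A minimal, self-contained sketch, independent of the Deep Netts API:

public class ReluDemo {

    // y = max(0, x)
    static float relu(float x) {
        return Math.max(0f, x);
    }

    // y' = 1 for x > 0, otherwise 0
    static float reluPrime(float x) {
        return x > 0 ? 1f : 0f;
    }

    public static void main(String[] args) {
        System.out.println(relu(2.5f));       // 2.5
        System.out.println(relu(-1.0f));      // 0.0
        System.out.println(reluPrime(2.5f));  // 1.0
        System.out.println(reluPrime(-1.0f)); // 0.0
    }
}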
Author:
    Zoran Sevarac
Constructor Summary

Constructors:
    Relu()
Method Summary
Modifier and TypeMethodDescriptionvoid
void
apply
(TensorBase tensor, int from, int to) float
getPrime
(float y) Returns the first derivative of activation function for specified output yfloat
getValue
(float x) Returns the value of activation function for specified input xMethods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface deepnetts.net.layers.activation.ActivationFunction
accept
Constructor Details

Relu

public Relu()

Method Details
getValue

public float getValue(float x)

Description copied from interface: ActivationFunction
Returns the value of activation function for specified input x

Specified by:
    getValue in interface ActivationFunction
Parameters:
    x - input for activation
Returns:
    value of activation function
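A minimal usage sketch, assuming only the public Relu() constructor and the getValue(float) method documented on this page:

import deepnetts.net.layers.activation.Relu;

public class GetValueExample {
    public static void main(String[] args) {
        Relu relu = new Relu();
        System.out.println(relu.getValue(3.2f));  // 3.2  (positive inputs pass through)
        System.out.println(relu.getValue(-0.7f)); // 0.0  (negative inputs are clipped)
    }
}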
getPrime

public float getPrime(float y)

Description copied from interface: ActivationFunction
Returns the first derivative of activation function for specified output y

Specified by:
    getPrime in interface ActivationFunction
Parameters:
    y - output of activation function
Returns:
    first derivative of activation function
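Note that getPrime takes the activation output y rather than the input x. For ReLU the two are interchangeable, because y = max(0, x) is positive exactly when x is positive. A short check using only the methods documented on this page:

import deepnetts.net.layers.activation.Relu;

public class GetPrimeExample {
    public static void main(String[] args) {
        Relu relu = new Relu();
        for (float x : new float[]{-2f, -0.5f, 0f, 0.5f, 2f}) {
            float y = relu.getValue(x);        // forward pass output
            float dy = relu.getPrime(y);       // derivative computed from the output
            float expected = x > 0 ? 1f : 0f;  // derivative computed from the input
            System.out.printf("x=%5.1f  y=%4.1f  y'=%3.1f  expected=%3.1f%n", x, y, dy, expected);
        }
    }
}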
apply

public void apply(TensorBase tensor)

Specified by:
    apply in interface ActivationFunction

apply

public void apply(TensorBase tensor, int from, int to)

Specified by:
    apply in interface ActivationFunction
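Conceptually, apply runs the activation element-wise and in place over a range of the tensor's values. A sketch of that behavior on a plain float[]; the array stand-in for TensorBase's backing storage and the half-open [from, to) range convention are assumptions, not the library's actual contract:

public class ApplySketch {

    // Hypothetical stand-in for apply(TensorBase tensor, int from, int to):
    // applies y = max(0, x) in place over values[from..to).
    static void applyRelu(float[] values, int from, int to) {
        for (int i = from; i < to; i++) {
            values[i] = Math.max(0f, values[i]);
        }
    }

    public static void main(String[] args) {
        float[] values = {-1f, 2f, -3f, 4f};
        applyRelu(values, 0, values.length);
        System.out.println(java.util.Arrays.toString(values)); // [0.0, 2.0, 0.0, 4.0]
    }
}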