Class Relu

java.lang.Object
deepnetts.net.layers.activation.Relu
All Implemented Interfaces:
ActivationFunction, Serializable, Consumer<TensorBase>

public final class Relu extends Object implements ActivationFunction, Serializable
Rectified Linear Activation and its Derivative.

y = max(0, x)

y' = | 1, x > 0
     | 0, x <= 0
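For illustration, a minimal standalone sketch of the two formulas above. This is an assumption-level example, not the Deep Netts implementation: the actual Relu class applies the function element-wise to a TensorBase through the ActivationFunction interface, while the class and method names below (ReluSketch, relu, reluPrime) are hypothetical.

    // Sketch of ReLU and its derivative as scalar functions.
    public class ReluSketch {

        // y = max(0, x)
        public static float relu(float x) {
            return Math.max(0f, x);
        }

        // y' = 1 for x > 0, 0 for x <= 0
        public static float reluPrime(float x) {
            return x > 0 ? 1f : 0f;
        }

        public static void main(String[] args) {
            float[] inputs = {-2.0f, -0.5f, 0.0f, 0.5f, 2.0f};
            for (float x : inputs) {
                System.out.printf("x=%5.2f  relu=%4.2f  relu'=%4.2f%n",
                        x, relu(x), reluPrime(x));
            }
        }
    }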
Author:
Zoran Sevarac
See Also: