Class LeakyRelu

java.lang.Object
deepnetts.net.layers.activation.LeakyRelu
All Implemented Interfaces:
ActivationFunction, Serializable, Consumer<TensorBase>

public final class LeakyRelu extends Object implements ActivationFunction, Serializable
Leaky Rectified Linear Activation and its Derivative.
y = x for x > 0, 0.1 * x for x <= 0
y' = 1 for x > 0, 0.01 for x <= 0
Allows a small, positive gradient when the unit is not active.
https://ai.stanford.edu/~amaas/papers/relu_hybrid_icml2013_final.pdf
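The piecewise formulas above can be sketched as a plain Java method pair. This is an illustrative standalone class, not the Deep Netts implementation; the class and constant names (`LeakyReluSketch`, `ALPHA`, `ALPHA_PRIME`) are assumptions, and the constants mirror the slopes stated in the description (0.1 in the forward pass, 0.01 in the derivative).

```java
// Illustrative sketch of the leaky ReLU math described above.
// Not the Deep Netts class; names and structure are assumptions.
public class LeakyReluSketch {

    // Slope for x <= 0 in the forward pass, per the description above.
    static final float ALPHA = 0.1f;

    // Slope for x <= 0 in the derivative, per the description above.
    static final float ALPHA_PRIME = 0.01f;

    // y = x for x > 0, 0.1 * x for x <= 0
    static float apply(float x) {
        return x > 0 ? x : ALPHA * x;
    }

    // y' = 1 for x > 0, 0.01 for x <= 0
    static float derivative(float x) {
        return x > 0 ? 1.0f : ALPHA_PRIME;
    }

    public static void main(String[] args) {
        System.out.println(apply(2.0f));       // positive input passes through unchanged
        System.out.println(apply(-2.0f));      // negative input is scaled by 0.1
        System.out.println(derivative(-2.0f)); // small constant gradient below zero
    }
}
```

The small negative-side slope is what distinguishes leaky ReLU from plain ReLU: units that would otherwise output a zero gradient still receive a small update signal.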
Author:
Zoran Sevarac
See Also: