Interface ActivationFunction

All Superinterfaces:
Consumer<TensorBase>
All Known Implementing Classes:
LeakyRelu, Linear, Relu, Sigmoid, SoftSign, Tanh

public interface ActivationFunction extends Consumer<TensorBase>
Common base interface for all activation functions used in layers. Classes implementing this interface should provide methods for calculating the value and the first derivative of the activation function. An activation function performs a non-linear transformation of its input before it is sent to the layer output. The first derivative of a function shows how fast, and in which direction, the function changes as its input changes; it is used by the training algorithm. For more details, see https://en.wikipedia.org/wiki/Activation_function
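As an illustration, the sketch below shows what an implementing class such as Sigmoid might look like. It is a minimal, self-contained example, not the library's actual source: the `Tensor` stand-in class, the method names `getValue` and `getPrime`, and the in-place `accept` transformation are assumptions made for the sketch, since the real `TensorBase` API is not shown here.

```java
import java.util.function.Consumer;

// Hypothetical minimal stand-in for TensorBase: a flat float array wrapper.
// The real TensorBase has a richer API; this is only for illustration.
class Tensor {
    final float[] values;
    Tensor(float... values) { this.values = values; }
}

// A sigmoid activation in the spirit of ActivationFunction: it provides
// the function value, its first derivative, and applies the non-linear
// transformation in place via Consumer.accept.
public class SigmoidExample implements Consumer<Tensor> {

    // Function value: f(x) = 1 / (1 + e^(-x))
    public float getValue(float x) {
        return 1.0f / (1.0f + (float) Math.exp(-x));
    }

    // First derivative expressed through the output value:
    // f'(x) = f(x) * (1 - f(x)); the argument y is the already-activated value.
    public float getPrime(float y) {
        return y * (1.0f - y);
    }

    @Override
    public void accept(Tensor t) {
        // Apply the activation element-wise, overwriting the tensor values.
        for (int i = 0; i < t.values.length; i++) {
            t.values[i] = getValue(t.values[i]);
        }
    }

    public static void main(String[] args) {
        SigmoidExample sigmoid = new SigmoidExample();
        Tensor t = new Tensor(0.0f);
        sigmoid.accept(t);
        System.out.println(t.values[0]);                    // sigmoid(0) = 0.5
        System.out.println(sigmoid.getPrime(t.values[0]));  // derivative at 0 = 0.25
    }
}
```

Taking the already-activated output as the derivative's argument is a common optimization in neural-network code, since for sigmoid the derivative is cheaply expressed through the function value computed during the forward pass.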
See Also: