tf.keras.activations.selu
Scaled Exponential Linear Unit (SELU).
`tf.keras.activations.selu(x)`
The Scaled Exponential Linear Unit (SELU) activation function is defined as `scale * x` if `x > 0` and `scale * alpha * (exp(x) - 1)` if `x < 0`, where `alpha` and `scale` are pre-defined constants (`alpha = 1.67326324` and `scale = 1.05070098`).

The SELU activation function multiplies `scale` (> 1) with the output of the [elu](https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/keras/activations/elu) (Exponential Linear Unit, ELU) activation function to ensure a slope larger than one for positive net inputs.

The values of `alpha` and `scale` are chosen so that the mean and variance of the inputs are preserved between two consecutive layers, as long as the weights are initialized correctly (see `lecun_normal` initialization) and the number of inputs is "large enough" (see the references for more information).
(Courtesy: blog post on Towards Data Science at https://towardsdatascience.com/selu-make-fnns-great-again-snn-8d61526802a9)
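As a quick sanity check, the piecewise definition above can be evaluated by hand and compared against `tf.keras.activations.selu`. This is a minimal sketch, assuming TensorFlow 2.x with eager execution and NumPy; the sample inputs are illustrative:

```python
import numpy as np
import tensorflow as tf

# Pre-defined SELU constants from the definition above.
alpha = 1.67326324
scale = 1.05070098

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0], dtype=np.float32)  # illustrative inputs

# Piecewise definition: scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise.
manual = np.where(x > 0, scale * x, scale * alpha * (np.exp(x) - 1.0))

keras_out = tf.keras.activations.selu(tf.constant(x)).numpy()
print(np.allclose(manual, keras_out))  # True
```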
Example Usage:
```python
import tensorflow as tf
from tensorflow.keras import models
from tensorflow.keras.layers import Dense, Flatten

n_classes = 10  # 10-class problem
model = models.Sequential()
# Flatten the (28, 28, 1) inputs so the Dense stack produces (batch, n_classes) outputs.
model.add(Flatten(input_shape=(28, 28, 1)))
model.add(Dense(64, kernel_initializer='lecun_normal', activation='selu'))
model.add(Dense(32, kernel_initializer='lecun_normal', activation='selu'))
model.add(Dense(16, kernel_initializer='lecun_normal', activation='selu'))
model.add(Dense(n_classes, activation='softmax'))
```
| Arguments | |
|---|---|
| `x` | A tensor or variable to compute the activation function for. |
| Returns | |
|---|---|
| The scaled exponential unit activation: `scale * elu(x, alpha)`. |
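Since the return value is simply `scale * elu(x, alpha)`, it can be reproduced with `tf.keras.activations.elu` directly. A minimal sketch, assuming TensorFlow 2.x:

```python
import tensorflow as tf

x = tf.constant([-1.0, 0.0, 1.0])
selu_out = tf.keras.activations.selu(x)
elu_out = 1.05070098 * tf.keras.activations.elu(x, alpha=1.67326324)

# Both formulations agree to within float tolerance.
tf.debugging.assert_near(selu_out, elu_out)
```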
Note:

- To be used together with the initialization "[lecun_normal](https://www.tensorflow.org/api_docs/python/tf/keras/initializers/lecun_normal)".
- To be used together with the dropout variant "[AlphaDropout](https://www.tensorflow.org/api_docs/python/tf/keras/layers/AlphaDropout)".
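A minimal sketch combining both recommendations, with `lecun_normal` initialization on the SELU layer and `AlphaDropout` in place of standard dropout (the layer sizes, input shape, and dropout rate are illustrative assumptions, not part of the API):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# lecun_normal keeps the network self-normalizing; AlphaDropout preserves
# the mean and variance of SELU activations, unlike standard Dropout.
model = models.Sequential([
    layers.Dense(64, kernel_initializer='lecun_normal',
                 activation='selu', input_shape=(784,)),
    layers.AlphaDropout(0.1),  # illustrative dropout rate
    layers.Dense(10, activation='softmax'),
])
```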
References:

- [Self-Normalizing Neural Networks (Klambauer et al., 2017)](https://arxiv.org/abs/1706.02515)
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/keras/activations/selu