tf.keras.activations.elu
Exponential linear unit.
tf.keras.activations.elu( x, alpha=1.0 )
| Arguments | |
| --- | --- |
| `x` | Input tensor. |
| `alpha` | A scalar, slope of the negative section. |
| Returns | |
| --- | --- |
| The exponential linear activation: `x` if `x > 0` and `alpha * (exp(x) - 1)` if `x <= 0`. |
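The piecewise formula above can be sketched in plain NumPy; this is an illustrative re-implementation for clarity, not the TensorFlow source.

```python
import numpy as np

def elu(x, alpha=1.0):
    """ELU: x if x > 0, alpha * (exp(x) - 1) otherwise."""
    x = np.asarray(x, dtype=np.float64)
    # np.expm1(x) computes exp(x) - 1 with better precision near zero
    return np.where(x > 0, x, alpha * np.expm1(x))

print(elu(np.array([-2.0, -1.0, 0.0, 1.0])))
```

For positive inputs the function is the identity; for negative inputs it saturates smoothly toward `-alpha`, e.g. `elu(-1.0)` is `exp(-1) - 1` (about `-0.632`).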
Reference:
- [Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs) (Clevert et al., 2015)](https://arxiv.org/abs/1511.07289)
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/keras/activations/elu