tf.nn.leaky_relu
Compute the Leaky ReLU activation function.
tf.nn.leaky_relu( features, alpha=0.2, name=None )
Args |
---|---
`features` | A `Tensor` representing preactivation values. Must be one of the following types: `float16`, `float32`, `float64`, `int32`, `int64`.
`alpha` | Slope of the activation function at `x < 0`.
`name` | A name for the operation (optional).
Returns
---
The activation value.
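The function computes `f(x) = x` for `x >= 0` and `f(x) = alpha * x` for `x < 0`. A minimal NumPy sketch of that math (this is an illustration of the formula, not TensorFlow's implementation):

```python
import numpy as np

def leaky_relu(features, alpha=0.2):
    # f(x) = x for x >= 0, alpha * x for x < 0
    features = np.asarray(features)
    return np.where(features >= 0, features, alpha * features)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))              # negative inputs are scaled by 0.2
print(leaky_relu(x, alpha=0.01))  # smaller alpha -> closer to plain ReLU
```

With the default `alpha=0.2`, an input of `-2.0` maps to `-0.4`, while positive inputs pass through unchanged.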
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/nn/leaky_relu