tf.keras.layers.LeakyReLU
Leaky version of a Rectified Linear Unit.
Inherits From: Layer
tf.keras.layers.LeakyReLU(alpha=0.3, **kwargs)
It allows a small gradient when the unit is not active: f(x) = alpha * x for x < 0, f(x) = x for x >= 0.
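For instance, here is a minimal sketch of the activation's behavior on a few sample values (the input values are illustrative, and eager execution is enabled explicitly since this page documents TF 1.15):

```python
import tensorflow as tf

tf.enable_eager_execution()  # TF 1.x: evaluate tensors eagerly

# Negative inputs are scaled by alpha (default 0.3);
# non-negative inputs pass through unchanged.
layer = tf.keras.layers.LeakyReLU()
output = layer(tf.constant([-10.0, -1.0, 0.0, 2.0]))
print(output.numpy())  # approximately [-3.  -0.3  0.   2. ]
```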
Input shape:
Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape:
Same shape as the input.
| Argument | Description |
|---|---|
| alpha | Float >= 0. Negative slope coefficient. |
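A brief usage sketch combining the points above (the Dense layer, the alpha value of 0.1, and the input shape are illustrative assumptions, not part of this API reference):

```python
import tensorflow as tf

# LeakyReLU as the first layer of a model: input_shape excludes the
# samples axis, and the layer's output shape matches its input shape.
model = tf.keras.Sequential([
    tf.keras.layers.LeakyReLU(alpha=0.1, input_shape=(10,)),
    tf.keras.layers.Dense(1),
])
model.summary()  # LeakyReLU output shape: (None, 10)
```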