tf.keras.layers.ThresholdedReLU
Thresholded Rectified Linear Unit.
Inherits From: Layer
tf.keras.layers.ThresholdedReLU(theta=1.0, **kwargs)
It follows:

f(x) = x for x > theta
f(x) = 0 otherwise
Input shape:
Arbitrary. Use the keyword argument input_shape
(tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape:
Same shape as the input.
| Arguments | |
|---|---|
| `theta` | Float >= 0. Threshold location of activation. |
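A minimal usage sketch. It assumes eager execution, which must be enabled explicitly in TF 1.15 (it is the default in TF 2.x); the input values are illustrative:

```python
import tensorflow as tf

tf.enable_eager_execution()  # TF 1.x only; not needed in TF 2.x

# Values at or below theta are zeroed; values strictly above pass through.
layer = tf.keras.layers.ThresholdedReLU(theta=1.0)
x = tf.constant([-1.0, 0.5, 1.0, 2.5])
print(layer(x).numpy())  # [0.  0.  0.  2.5]
```

Note that the threshold is strict: an input exactly equal to `theta` (here, 1.0) is mapped to 0.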