tf.contrib.nn.scaled_softplus
Returns y = alpha * ln(1 + exp(x / alpha)), or min(y, clip) if clip is given.
```python
tf.contrib.nn.scaled_softplus(
    x, alpha, clip=None, name=None
)
```
This can be seen as a softplus applied to the scaled input, with the output appropriately scaled. Clipping is optional. As alpha tends to 0, scaled_softplus(x, alpha) tends to relu(x), and scaled_softplus(x, alpha, clip=6) tends to relu6(x).
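A minimal usage sketch of these limits, assuming TensorFlow 1.15 (tf.contrib was removed in TF 2.x); the input values are illustrative, and Python floats are passed for alpha and clip on the assumption that TensorFlow converts them to tensors:

```python
import tensorflow as tf  # 1.x API; tf.contrib is not available in TF 2.x

x = tf.constant([-2.0, 0.0, 3.0, 10.0])

# Small alpha: the output is close to tf.nn.relu(x).
y = tf.contrib.nn.scaled_softplus(x, alpha=0.01)

# Small alpha with clip=6: the output is close to tf.nn.relu6(x).
y6 = tf.contrib.nn.scaled_softplus(x, alpha=0.01, clip=6.0)

with tf.Session() as sess:
    print(sess.run(y))   # approx. [0., 0.007, 3., 10.]
    print(sess.run(y6))  # approx. [0., 0.007, 3., 6.]
```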
Note: the gradient for this operation is defined to depend on the backprop inputs as well as the outputs of this operation.
| Args | |
|---|---|
| x | A Tensor of inputs. |
| alpha | A Tensor, indicating the amount of smoothness. The caller must ensure that alpha > 0. |
| clip | (optional) A Tensor, the upper bound to clip the values. |
| name | A name for the scope of the operations (optional). |
| Returns | |
|---|---|
| A tensor whose size and type are determined by broadcasting the inputs. |
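For reference, the documented formula can be checked with a plain NumPy sketch (scaled_softplus_ref is a hypothetical name, not part of the API); note that this naive form overflows for large x / alpha, which the TF op is expected to avoid by using a numerically stable softplus:

```python
import numpy as np

def scaled_softplus_ref(x, alpha, clip=None):
    """Naive reference: alpha * ln(1 + exp(x / alpha)), min'd with clip if given."""
    y = alpha * np.log1p(np.exp(x / alpha))  # overflows when x / alpha is large
    return y if clip is None else np.minimum(y, clip)

print(scaled_softplus_ref(np.array([-1.0, 0.0, 2.0]), alpha=0.5))
# approx. [0.063, 0.347, 2.009]
print(scaled_softplus_ref(np.array([-1.0, 0.0, 8.0]), alpha=0.5, clip=6.0))
# approx. [0.063, 0.347, 6.0] -- the last value is clipped
```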
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/contrib/nn/scaled_softplus