tensorflow::ops::Selu
#include <nn_ops.h>
Computes the scaled exponential linear activation: `scale * alpha * (exp(features) - 1)` if `features < 0`, `scale * features` otherwise.
Summary
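Written out in closed form; the values of `alpha` and `scale` are quoted here for reference from the SELU paper and are not stated on this page:

```latex
\mathrm{selu}(x) = \lambda
\begin{cases}
x, & x > 0 \\
\alpha\,(e^{x} - 1), & x \le 0
\end{cases}
\qquad \alpha \approx 1.67326,\ \lambda = \mathtt{scale} \approx 1.05070
```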
To be used together with `initializer = tf.variance_scaling_initializer(factor=1.0, mode='FAN_IN')`. For correct dropout, use `tf.contrib.nn.alpha_dropout`.
See [Self-Normalizing Neural Networks](https://arxiv.org/abs/1706.02515).
Arguments:
- scope: A Scope object
Returns:
- `Output`: The activations tensor.
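A minimal usage sketch, assuming a TensorFlow build with the C++ `cc` client libraries linked in; the input values and the `ClientSession` setup are illustrative and not part of this page:

```cpp
#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/ops/nn_ops.h"
#include "tensorflow/cc/ops/standard_ops.h"
#include "tensorflow/core/framework/tensor.h"

int main() {
  using namespace tensorflow;
  using namespace tensorflow::ops;

  Scope root = Scope::NewRootScope();

  // A small input batch; the negative value exercises the exponential branch.
  auto features = Const(root, {{-1.0f, 0.0f, 1.0f}});

  // Build the Selu node; `activations` is its single output.
  auto selu = Selu(root, features);

  ClientSession session(root);
  std::vector<Tensor> outputs;
  TF_CHECK_OK(session.Run({selu.activations}, &outputs));

  // outputs[0] holds scale * alpha * (exp(x) - 1) for x < 0,
  // and scale * x otherwise.
  LOG(INFO) << outputs[0].matrix<float>();
  return 0;
}
```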
| Constructors and Destructors | |
|---|---|
| `Selu(const ::tensorflow::Scope & scope, ::tensorflow::Input features)` | |

| Public attributes | |
|---|---|
| `activations` | `::tensorflow::Output` |
| `operation` | `Operation` |

| Public functions | |
|---|---|
| `node() const` | `::tensorflow::Node *` |
| `operator::tensorflow::Input() const` | |
| `operator::tensorflow::Output() const` | |
Public attributes
activations
`::tensorflow::Output activations`
operation
`Operation operation`
Public functions
Selu
`Selu(const ::tensorflow::Scope & scope, ::tensorflow::Input features)`
node
`::tensorflow::Node * node() const`
operator::tensorflow::Input
`operator::tensorflow::Input() const`
operator::tensorflow::Output
`operator::tensorflow::Output() const`
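Because `Selu` defines both conversion operators, the op object itself can be fed straight into a downstream op; a short sketch continuing the example above (the `Multiply` composition is an illustrative assumption, not from this page):

```cpp
// The Selu object converts implicitly to its single output via
// operator ::tensorflow::Output(), so it can be wired directly
// into another op without naming `activations`.
auto selu = Selu(root, features);
auto scaled = Multiply(root, selu, Const(root, 2.0f));
```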