tf.contrib.opt.clip_gradients_by_global_norm
Clips gradients of a multitask loss by their global norm.
```python
tf.contrib.opt.clip_gradients_by_global_norm(
    gradients_variables, clip_norm=20.0
)
```
Ignores all-zero tensors when computing the global norm.
| Args | |
|---|---|
| `gradients_variables` | A list of `(gradient, variable)` pairs. |
| `clip_norm` | A float `Tensor`, the global norm to clip on. Default is 20.0. |
| Returns | |
|---|---|
| `list` | A list of pairs of the same type as `gradients_variables`, with each gradient clipped. |
| `fixed_global_norm` | A 0-D (scalar) `Tensor` representing the global norm. |
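The clipping math can be illustrated with a minimal NumPy sketch (not the actual TensorFlow implementation): the global norm is the square root of the sum of squared entries across all gradients, all-zero tensors are ignored when computing it, and every gradient is scaled down by `clip_norm / global_norm` whenever the global norm exceeds `clip_norm`.

```python
import numpy as np

def clip_gradients_by_global_norm_sketch(gradients_variables, clip_norm=20.0):
    """Illustrative NumPy sketch of global-norm gradient clipping.

    Mirrors the documented behavior: all-zero gradient tensors are
    ignored when computing the global norm, and all gradients are
    rescaled when that norm exceeds clip_norm.
    """
    grads = [g for g, _ in gradients_variables]
    # Global norm over the non-zero gradient tensors only.
    nonzero = [g for g in grads if np.any(g)]
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in nonzero))
    # Scale factor is 1.0 when global_norm <= clip_norm,
    # otherwise clip_norm / global_norm.
    scale = clip_norm / max(global_norm, clip_norm)
    clipped = [(g * scale, v) for g, v in gradients_variables]
    return clipped, global_norm
```

For example, with gradients `[3.0, 4.0]` (norm 5.0) and `clip_norm=2.5`, every gradient is halved; with the default `clip_norm=20.0` the same gradients pass through unchanged.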
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/contrib/opt/clip_gradients_by_global_norm