torch.nn.utils.clip_grad_norm_
torch.nn.utils.clip_grad_norm_(parameters, max_norm, norm_type=2.0)
Clips gradient norm of an iterable of parameters.
The norm is computed over all gradients together, as if they were concatenated into a single vector. Gradients are modified in-place.
- Parameters
    - parameters (Iterable[Tensor] or Tensor) – an iterable of Tensors or a single Tensor that will have gradients normalized
    - max_norm (float or int) – max norm of the gradients
    - norm_type (float or int) – type of the used p-norm. Can be 'inf' for infinity norm.
- Returns
Total norm of the parameters (viewed as a single vector).
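A minimal usage sketch: the model, input shapes, and `max_norm` value below are illustrative, not prescribed by the API. The function returns the total norm measured *before* clipping; after the call, the combined gradient norm is at most `max_norm` (up to floating-point tolerance).

```python
import torch
import torch.nn as nn

# Hypothetical minimal setup: a small linear model and dummy data.
model = nn.Linear(4, 2)
x = torch.randn(8, 4)
loss = model(x).sum()
loss.backward()

# Clip the total gradient norm to 1.0; the pre-clipping norm is returned.
total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

# The gradients were modified in place: concatenating them into one
# vector (as the docs describe) now yields a norm of at most max_norm.
grads = [p.grad.detach().flatten() for p in model.parameters()]
clipped_norm = torch.cat(grads).norm(2.0)
```

In a training loop, the call is typically placed between `loss.backward()` and `optimizer.step()`, so the optimizer sees the clipped gradients.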
© 2019 Torch Contributors
Licensed under the 3-clause BSD License.
https://pytorch.org/docs/1.8.0/generated/torch.nn.utils.clip_grad_norm_.html