tf.raw_ops.SymbolicGradient
Computes the gradient function for function f via backpropagation.
tf.raw_ops.SymbolicGradient(input, Tout, f, name=None)
Args | |
---|---|
input | A list of Tensor objects. A list of input tensors of size N + M: the N inputs to f followed by the M upstream gradients dL/dy_1, ..., dL/dy_M. |
Tout | A list of tf.DTypes that has length >= 1. The type list for the output (gradient) list. |
f | A function decorated with @Defun. The function to differentiate. 'f' must be a numerical function that takes N inputs and produces M outputs. Its gradient function 'g', which this SymbolicGradient op computes, takes N + M inputs and produces N outputs: if (y1, y2, ..., y_M) = f(x1, x2, ..., x_N), then (dL/dx1, dL/dx2, ..., dL/dx_N) = g(x1, x2, ..., x_N, dL/dy1, dL/dy2, ..., dL/dy_M), where L is a scalar-valued function of (x1, x2, ..., x_N) (e.g., a loss function) and dL/dx_i is the partial derivative of L with respect to x_i. A worked illustration of this contract follows the Returns table below. |
name | A name for the operation (optional). |
Returns | |
---|---|
A list of Tensor objects of type Tout.
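
The following is a minimal sketch (assuming TensorFlow 2.x) of the N + M → N gradient contract described for f above. It does not call tf.raw_ops.SymbolicGradient directly, since the raw op's f attribute must be a graph-registered function and the op is typically invoked by TensorFlow's own gradient machinery; here tf.GradientTape with output_gradients stands in for the gradient function g.

```python
import tensorflow as tf

# Sketch of the contract: given f with N inputs and M outputs, its gradient
# function g takes the N inputs plus the M upstream gradients dL/dy_j and
# returns the N gradients dL/dx_i. tf.GradientTape plays the role of g here;
# this does not invoke the raw op itself.

def f(x1, x2):               # N = 2 inputs
    y1 = x1 * x2
    y2 = x1 + x2
    return y1, y2            # M = 2 outputs

x1 = tf.constant(3.0)
x2 = tf.constant(5.0)
dL_dy1 = tf.constant(1.0)    # upstream gradients dL/dy_j, e.g. from a loss L
dL_dy2 = tf.constant(0.5)

with tf.GradientTape() as tape:
    tape.watch([x1, x2])
    y1, y2 = f(x1, x2)

# g(x1, x2, dL/dy1, dL/dy2) -> (dL/dx1, dL/dx2)
dL_dx1, dL_dx2 = tape.gradient(
    [y1, y2], [x1, x2], output_gradients=[dL_dy1, dL_dy2])

print(dL_dx1.numpy())  # dL/dx1 = dL/dy1 * x2 + dL/dy2 * 1 = 5.5
print(dL_dx2.numpy())  # dL/dx2 = dL/dy1 * x1 + dL/dy2 * 1 = 3.5
```

Calling the raw op itself would additionally require passing the concatenated (x, dL/dy) list as input and the gradient dtypes as Tout, mirroring the argument table above.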
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.4/api_docs/python/tf/raw_ops/SymbolicGradient