tf.keras.losses.KLD
Computes Kullback-Leibler divergence loss between y_true and y_pred.
tf.keras.losses.KLD(
    y_true, y_pred
)
loss = sum(y_true * log(y_true / y_pred), axis=-1)
Both tensors are clipped to the range [epsilon, 1] before the logarithm to avoid division by zero, and the elementwise terms are summed over the last axis, so one loss value is produced per sample.
See: https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
Usage:

import tensorflow as tf  # In TF 1.x, also call tf.enable_eager_execution() so .numpy() works.

loss = tf.keras.losses.KLD([.4, .9, .2], [.5, .8, .12])
print('Loss: ', loss.numpy())  # Loss: 0.11891246
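
The value above can be reproduced by applying the formula directly; a minimal check with NumPy (the variable names are illustrative, not part of the API):

import numpy as np

y_true = np.array([.4, .9, .2])
y_pred = np.array([.5, .8, .12])

# Elementwise terms y_true * log(y_true / y_pred), summed over the last axis.
print(np.sum(y_true * np.log(y_true / y_pred)))  # ~0.11891246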
| Args | |
|---|---|
| y_true | Tensor of true targets. |
| y_pred | Tensor of predicted targets. |
| Returns |
|---|
| A Tensor with loss. |
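
With batched (2-D) inputs the sum runs over the last axis, so the result holds one loss value per sample; a short sketch with illustrative values:

y_true = tf.constant([[.4, .9, .2], [.3, .3, .4]])
y_pred = tf.constant([[.5, .8, .12], [.2, .5, .3]])
loss = tf.keras.losses.KLD(y_true, y_pred)
print(loss.numpy().shape)  # (2,) -- one KL value per row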
| Raises | |
|---|---|
| TypeError | If y_true cannot be cast to y_pred.dtype. |
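
The function can also be passed directly to Model.compile as a loss; a minimal sketch (the model below is an assumed example, not part of the original docs):

import tensorflow as tf

# KL divergence compares probability distributions, so a softmax output is a natural fit.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(3, activation='softmax', input_shape=(4,))
])
model.compile(optimizer='adam', loss=tf.keras.losses.KLD)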