tf.contrib.losses.metric_learning.triplet_semihard_loss
Computes the triplet loss with semi-hard negative mining.
```python
tf.contrib.losses.metric_learning.triplet_semihard_loss(
    labels,
    embeddings,
    margin=1.0
)
```
The loss encourages the positive distance (between a pair of embeddings with the same label) to be smaller, by at least the margin constant, than the smallest negative distance among negatives that are already farther from the anchor than the positive (the so-called semi-hard negatives) in the mini-batch. If no such negative exists, the largest negative distance is used instead. See: https://arxiv.org/abs/1503.03832
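For intuition only, here is a minimal NumPy sketch of the semi-hard selection rule for a single anchor-positive pair; this is not the library implementation, and the function name and arguments (`semihard_loss_single_pair`, `d_ap`, `d_an`) are hypothetical names introduced for illustration.

```python
import numpy as np

def semihard_loss_single_pair(d_ap, d_an, margin=1.0):
    """Illustrative loss for one anchor-positive pair.

    d_ap: scalar distance between the anchor and its positive.
    d_an: 1-D array of distances from the anchor to every negative.
    """
    # Semi-hard negatives: negatives farther from the anchor than the positive.
    semihard = d_an[d_an > d_ap]
    if semihard.size > 0:
        d_neg = semihard.min()   # closest semi-hard negative
    else:
        d_neg = d_an.max()       # fall back to the largest negative distance
    # Hinge on the margin: zero loss once d_ap + margin <= d_neg.
    return max(0.0, d_ap + margin - d_neg)

# Example: d_ap = 0.4, negatives at distances [0.3, 0.6, 1.2]
# -> semi-hard candidates are [0.6, 1.2]; the closest is 0.6,
#    so the loss is max(0, 0.4 + 1.0 - 0.6) = 0.8.
```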
Args | |
---|---|
`labels` | 1-D `tf.int32` `Tensor` with shape `[batch_size]` of multiclass integer labels. |
`embeddings` | 2-D float `Tensor` of embedding vectors. Embeddings should be L2-normalized. |
`margin` | Float, margin term in the loss definition. |
Returns | |
---|---|
`triplet_loss` | `tf.float32` scalar. |
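A minimal usage sketch under TensorFlow 1.15 graph mode, assuming a hypothetical batch of 8 unnormalized 64-dimensional embeddings fed through placeholders; the tensor names (`raw_embeddings`, `labels`) and shapes are illustrative, not part of the API.

```python
import tensorflow as tf  # TensorFlow 1.15, where tf.contrib is still available

# Hypothetical inputs: 8 examples with integer class labels and 64-d embeddings.
labels = tf.placeholder(tf.int32, shape=[8])
raw_embeddings = tf.placeholder(tf.float32, shape=[8, 64])

# The docstring expects L2-normalized embeddings.
embeddings = tf.nn.l2_normalize(raw_embeddings, axis=1)

# Scalar tf.float32 loss with semi-hard negative mining over the mini-batch.
loss = tf.contrib.losses.metric_learning.triplet_semihard_loss(
    labels=labels, embeddings=embeddings, margin=1.0)
```

The resulting `loss` tensor can be minimized directly with any TF 1.x optimizer (e.g. passed to `optimizer.minimize`).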
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/contrib/losses/metric_learning/triplet_semihard_loss