Module: tf.metrics
Evaluation-related metrics.
Functions
accuracy(...)
: Calculates how often predictions match labels.
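For example, a minimal sketch of the (value, update_op) pattern shared by every function in this module, assuming TensorFlow 1.x graph mode; the constant tensors are illustrative:

  import tensorflow as tf  # TensorFlow 1.x

  labels = tf.constant([1, 0, 1, 1])
  predictions = tf.constant([1, 0, 0, 1])

  # Each tf.metrics function returns (value_tensor, update_op).
  acc, acc_update_op = tf.metrics.accuracy(labels=labels, predictions=predictions)

  with tf.Session() as sess:
      # Metric state is kept in local variables, so initialize them first.
      sess.run(tf.local_variables_initializer())
      sess.run(acc_update_op)   # accumulate counts for this batch
      print(sess.run(acc))      # 0.75
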
auc(...)
: Computes the approximate AUC via a Riemann sum.
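A minimal sketch, assuming boolean-like labels and probability scores in [0, 1]; the values below are illustrative:

  import tensorflow as tf  # TensorFlow 1.x

  labels = tf.constant([0, 0, 1, 1])
  scores = tf.constant([0.1, 0.6, 0.4, 0.9])

  # The AUC is approximated with a Riemann sum over num_thresholds buckets.
  auc, auc_update_op = tf.metrics.auc(labels=labels, predictions=scores,
                                      num_thresholds=200)

  with tf.Session() as sess:
      sess.run(tf.local_variables_initializer())
      sess.run(auc_update_op)
      print(sess.run(auc))  # approximate ROC AUC over the accumulated batches
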
average_precision_at_k(...)
: Computes average precision@k of predictions with respect to sparse labels.
false_negatives(...)
: Computes the total number of false negatives.
false_negatives_at_thresholds(...)
: Computes false negatives at provided threshold values.
false_positives(...)
: Sum the weights of false positives.
false_positives_at_thresholds(...)
: Computes false positives at provided threshold values.
mean(...)
: Computes the (weighted) mean of the given values.
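A minimal sketch of the streaming (running) mean across batches; the placeholder and feed values are illustrative:

  import tensorflow as tf  # TensorFlow 1.x

  values = tf.placeholder(tf.float32, shape=[None])
  mean, mean_update_op = tf.metrics.mean(values)

  with tf.Session() as sess:
      sess.run(tf.local_variables_initializer())
      # The mean accumulates across successive runs of the update op.
      sess.run(mean_update_op, feed_dict={values: [1.0, 2.0]})
      sess.run(mean_update_op, feed_dict={values: [3.0, 4.0]})
      print(sess.run(mean))  # 2.5
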
mean_absolute_error(...)
: Computes the mean absolute error between the labels and predictions.
mean_cosine_distance(...)
: Computes the cosine distance between the labels and predictions.
mean_iou(...)
: Calculate per-step mean Intersection-Over-Union (mIOU).
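A minimal sketch, assuming integer class ids for both labels and predictions; num_classes must be supplied so the confusion matrix can be built:

  import tensorflow as tf  # TensorFlow 1.x

  labels = tf.constant([0, 0, 1, 1])
  predictions = tf.constant([0, 1, 1, 1])

  miou, miou_update_op = tf.metrics.mean_iou(labels=labels,
                                             predictions=predictions,
                                             num_classes=2)

  with tf.Session() as sess:
      sess.run(tf.local_variables_initializer())
      sess.run(miou_update_op)  # accumulate the confusion matrix
      print(sess.run(miou))     # mean of the per-class IOU values
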
mean_per_class_accuracy(...)
: Calculates the mean of the per-class accuracies.
mean_relative_error(...)
: Computes the mean relative error by normalizing with the given values.
mean_squared_error(...)
: Computes the mean squared error between the labels and predictions.
mean_tensor(...)
: Computes the element-wise (weighted) mean of the given tensors.
percentage_below(...)
: Computes the percentage of values less than the given threshold.
precision(...)
: Computes the precision of the predictions with respect to the labels.
precision_at_k(...)
: Computes precision@k of the predictions with respect to sparse labels.
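A minimal sketch, assuming per-class scores and int64 sparse label ids; the values are illustrative:

  import tensorflow as tf  # TensorFlow 1.x

  # Scores over 4 classes for 2 examples; labels hold the true class ids.
  predictions = tf.constant([[0.1, 0.4, 0.3, 0.2],
                             [0.5, 0.1, 0.1, 0.3]])
  labels = tf.constant([[1], [2]], dtype=tf.int64)

  prec, prec_update_op = tf.metrics.precision_at_k(labels=labels,
                                                   predictions=predictions,
                                                   k=2)

  with tf.Session() as sess:
      sess.run(tf.local_variables_initializer())
      sess.run(prec_update_op)
      print(sess.run(prec))  # fraction of top-2 predictions that match a label
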
precision_at_thresholds(...)
: Computes precision values for different thresholds on predictions.
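A minimal sketch; one precision value is returned per threshold (the scores and thresholds below are illustrative):

  import tensorflow as tf  # TensorFlow 1.x

  labels = tf.constant([0, 1, 1, 0])
  scores = tf.constant([0.2, 0.7, 0.6, 0.4])

  prec, prec_update_op = tf.metrics.precision_at_thresholds(
      labels=labels, predictions=scores, thresholds=[0.3, 0.5, 0.8])

  with tf.Session() as sess:
      sess.run(tf.local_variables_initializer())
      sess.run(prec_update_op)
      print(sess.run(prec))  # one precision value per threshold
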
precision_at_top_k(...)
: Computes precision@k of top-k predictions with respect to sparse labels.
recall(...)
: Computes the recall of the predictions with respect to the labels.
recall_at_k(...)
: Computes recall@k of the predictions with respect to sparse labels.
recall_at_thresholds(...)
: Computes various recall values for different thresholds on predictions.
recall_at_top_k(...)
: Computes recall@k of top-k predictions with respect to sparse labels.
root_mean_squared_error(...)
: Computes the root mean squared error between the labels and predictions.
sensitivity_at_specificity(...)
: Computes the sensitivity at a given specificity.
sparse_average_precision_at_k(...)
: Renamed to average_precision_at_k, please use that method instead. (deprecated)
sparse_precision_at_k(...)
: Renamed to precision_at_k, please use that method instead. (deprecated)
specificity_at_sensitivity(...)
: Computes the specificity at a given sensitivity.
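A minimal sketch, assuming probability scores; the target sensitivity of 0.8 is illustrative:

  import tensorflow as tf  # TensorFlow 1.x

  labels = tf.constant([0, 0, 1, 1, 1])
  scores = tf.constant([0.1, 0.7, 0.4, 0.6, 0.9])

  # Finds the threshold at which sensitivity (recall) reaches ~0.8,
  # then reports the specificity at that threshold.
  spec, spec_update_op = tf.metrics.specificity_at_sensitivity(
      labels=labels, predictions=scores, sensitivity=0.8)

  with tf.Session() as sess:
      sess.run(tf.local_variables_initializer())
      sess.run(spec_update_op)
      print(sess.run(spec))
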
true_negatives(...)
: Sum the weights of true negatives.
true_negatives_at_thresholds(...)
: Computes true negatives at provided threshold values.
true_positives(...)
: Sum the weights of true positives.
true_positives_at_thresholds(...)
: Computes true positives at provided threshold values.
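The count-style metrics (true/false positives/negatives and their _at_thresholds variants) follow the same pattern; a minimal sketch with illustrative 0/1 tensors:

  import tensorflow as tf  # TensorFlow 1.x

  labels = tf.constant([0, 1, 1, 0, 1])
  predictions = tf.constant([0, 1, 0, 1, 1])

  tp, tp_update_op = tf.metrics.true_positives(labels=labels, predictions=predictions)
  fn, fn_update_op = tf.metrics.false_negatives(labels=labels, predictions=predictions)

  with tf.Session() as sess:
      sess.run(tf.local_variables_initializer())
      sess.run([tp_update_op, fn_update_op])
      print(sess.run([tp, fn]))  # [2.0, 1.0]
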