Calculates the Concordance Correlation Coefficient (CCC).
Source: R/metrics.R
Formula:
loss <- mean(
  2 * (y_true - mean(y_true)) * (y_pred - mean(y_pred)) /
    (var(y_true) + var(y_pred) + (mean(y_true) - mean(y_pred))^2)
)
CCC evaluates the agreement between true values (y_true) and predicted values (y_pred) by considering both precision and accuracy. The coefficient ranges from -1 to 1, where a value of 1 indicates perfect agreement.
This metric is useful in regression tasks where it is important to assess how well the predictions match the true values, taking into account both their correlation and proximity to the 45-degree line of perfect concordance.
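To build intuition for the formula, the following is a minimal base-R sketch that evaluates it directly on flattened vectors. It uses population (divide-by-n) moments as an assumption; the streaming metric below accumulates its statistics across update_state() calls and may not match this sketch exactly.
ccc_reference <- function(y_true, y_pred) {
  # Population means, variances, and covariance (divide by n).
  mt <- mean(y_true); mp <- mean(y_pred)
  vt <- mean((y_true - mt)^2)
  vp <- mean((y_pred - mp)^2)
  cov_tp <- mean((y_true - mt) * (y_pred - mp))
  # CCC: scaled covariance, penalised by the gap between the two means.
  2 * cov_tp / (vt + vp + (mt - mp)^2)
}

ccc_reference(c(0, 1, 0.5, 1, 1, 0.2),
              c(0.1, 0.9, 0.5, 1, 0.9, 0.2))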
Usage
metric_concordance_correlation(
  y_true,
  y_pred,
  axis = -1L,
  ...,
  name = "concordance_correlation",
  dtype = NULL
)
Arguments
- y_true
Tensor of true targets.
- y_pred
Tensor of predicted targets.
- axis
(Optional) integer or tuple of integers of the axis/axes along which to compute the metric. Defaults to -1.
- ...
For forward/backward compatibility.
- name
(Optional) string name of the metric instance.
- dtype
(Optional) data type of the metric result.
Examples
ccc <- metric_concordance_correlation(axis = -1)
y_true <- rbind(c(0, 1, 0.5),
                c(1, 1, 0.2))
y_pred <- rbind(c(0.1, 0.9, 0.5),
                c(1, 0.9, 0.2))
ccc$update_state(y_true, y_pred)
ccc$result()
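The metric is stateful: update_state() accumulates running statistics and result() returns the current value. Assuming the standard Metric API, accumulated state can be cleared between evaluation runs:
ccc$reset_state()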
Usage with compile() API:
model |> compile(
optimizer = 'sgd',
loss = 'mean_squared_error',
metrics = c(metric_concordance_correlation())
)
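For context, a minimal end-to-end sketch follows; the model architecture, data, and training settings are illustrative assumptions, not part of this reference page.
library(keras3)

# Small regression model compiled with CCC as a monitoring metric.
model <- keras_model_sequential(input_shape = 10) |>
  layer_dense(units = 16, activation = "relu") |>
  layer_dense(units = 1)

model |> compile(
  optimizer = "sgd",
  loss = "mean_squared_error",
  metrics = c(metric_concordance_correlation())
)

# Placeholder data for illustration only.
x <- matrix(rnorm(100 * 10), ncol = 10)
y <- rnorm(100)
model |> fit(x, y, epochs = 2, verbose = 0)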
See also
Other regression metrics: metric_cosine_similarity(), metric_log_cosh_error(), metric_mean_absolute_error(), metric_mean_absolute_percentage_error(), metric_mean_squared_error(), metric_mean_squared_logarithmic_error(), metric_pearson_correlation(), metric_r2_score(), metric_root_mean_squared_error()
Other metrics: Metric(), custom_metric(), metric_auc(), metric_binary_accuracy(), metric_binary_crossentropy(), metric_binary_focal_crossentropy(), metric_binary_iou(), metric_categorical_accuracy(), metric_categorical_crossentropy(), metric_categorical_focal_crossentropy(), metric_categorical_hinge(), metric_cosine_similarity(), metric_f1_score(), metric_false_negatives(), metric_false_positives(), metric_fbeta_score(), metric_hinge(), metric_huber(), metric_iou(), metric_kl_divergence(), metric_log_cosh(), metric_log_cosh_error(), metric_mean(), metric_mean_absolute_error(), metric_mean_absolute_percentage_error(), metric_mean_iou(), metric_mean_squared_error(), metric_mean_squared_logarithmic_error(), metric_mean_wrapper(), metric_one_hot_iou(), metric_one_hot_mean_iou(), metric_pearson_correlation(), metric_poisson(), metric_precision(), metric_precision_at_recall(), metric_r2_score(), metric_recall(), metric_recall_at_precision(), metric_root_mean_squared_error(), metric_sensitivity_at_specificity(), metric_sparse_categorical_accuracy(), metric_sparse_categorical_crossentropy(), metric_sparse_top_k_categorical_accuracy(), metric_specificity_at_sensitivity(), metric_squared_hinge(), metric_sum(), metric_top_k_categorical_accuracy(), metric_true_negatives(), metric_true_positives()