# Computes Kullback-Leibler divergence loss between `y_true` & `y_pred`

Source: `R/losses.R`

`loss_kl_divergence.Rd`

Formula:

`loss <- y_true * log(y_true / y_pred)`

`y_true` and `y_pred` are expected to be probability distributions, with values between 0 and 1. They will get clipped to the `[0, 1]` range.
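To make the formula concrete, here is a minimal base-R sketch that evaluates the elementwise expression directly on plain numeric vectors (the probability values below are made up for illustration; the actual function operates on tensors and additionally applies clipping and the configured reduction):

```r
# Hypothetical example probability distributions (each sums to 1)
y_true <- c(0.2, 0.3, 0.5)
y_pred <- c(0.25, 0.25, 0.5)

# Apply the documented formula elementwise, then sum the contributions:
#   loss <- y_true * log(y_true / y_pred)
kl <- sum(y_true * log(y_true / y_pred))
kl
```

Note that when `y_true` equals `y_pred` everywhere, every `log(y_true / y_pred)` term is zero, so the divergence is zero.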

## Usage

```
loss_kl_divergence(
  y_true,
  y_pred,
  ...,
  reduction = "sum_over_batch_size",
  name = "kl_divergence"
)
```

## Arguments

- y_true
Tensor of true targets.

- y_pred
Tensor of predicted targets.

- ...
For forward/backward compatibility.

- reduction
Type of reduction to apply to the loss. In almost all cases this should be `"sum_over_batch_size"`. Supported options are `"sum"`, `"sum_over_batch_size"`, or `NULL`.

- name
Optional name for the loss instance.

## Examples

```
y_true <- random_uniform(c(2, 3), 0, 2)
y_pred <- random_uniform(c(2, 3))
loss <- loss_kl_divergence(y_true, y_pred)
loss
```

## See also

Other losses: `Loss()`, `loss_binary_crossentropy()`, `loss_binary_focal_crossentropy()`, `loss_categorical_crossentropy()`, `loss_categorical_focal_crossentropy()`, `loss_categorical_hinge()`, `loss_cosine_similarity()`, `loss_ctc()`, `loss_dice()`, `loss_hinge()`, `loss_huber()`, `loss_log_cosh()`, `loss_mean_absolute_error()`, `loss_mean_absolute_percentage_error()`, `loss_mean_squared_error()`, `loss_mean_squared_logarithmic_error()`, `loss_poisson()`, `loss_sparse_categorical_crossentropy()`, `loss_squared_hinge()`, `loss_tversky()`, `metric_binary_crossentropy()`, `metric_binary_focal_crossentropy()`, `metric_categorical_crossentropy()`, `metric_categorical_focal_crossentropy()`, `metric_categorical_hinge()`, `metric_hinge()`, `metric_huber()`, `metric_kl_divergence()`, `metric_log_cosh()`, `metric_mean_absolute_error()`, `metric_mean_absolute_percentage_error()`, `metric_mean_squared_error()`, `metric_mean_squared_logarithmic_error()`, `metric_poisson()`, `metric_sparse_categorical_crossentropy()`, `metric_squared_hinge()`