With default values, this returns the standard ReLU activation:
max(x, 0), the element-wise maximum of 0 and the input tensor.
Modifying default parameters allows you to use non-zero thresholds, to change the max value of the activation, and to use a non-zero multiple of the input for values below the threshold.
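The piecewise rule above can be sketched directly in base R. The helper below, relu_sketch(), is a hypothetical illustration of that formula, not the keras3 implementation (which operates on tensors):

relu_sketch <- function(x, negative_slope = 0, max_value = Inf, threshold = 0) {
  # values at or above the threshold pass through; values below it are
  # scaled by negative_slope, measured relative to the threshold
  out <- ifelse(x >= threshold, x, negative_slope * (x - threshold))
  # saturate at max_value
  pmin(out, max_value)
}
relu_sketch(c(-10, -5, 0, 5, 10))                        # 0  0  0  5 10
relu_sketch(c(-10, -5, 0, 5, 10), negative_slope = 0.5)  # -5 -2.5 0 5 10
relu_sketch(c(-10, -5, 0, 5, 10), max_value = 5)         # 0  0  0  5  5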
Arguments
- x
Input tensor.
- negative_slope
A numeric that controls the slope for values lower than the threshold.
- max_value
A numeric that sets the saturation threshold (the largest value the function will return).
- threshold
A numeric giving the threshold value of the activation function below which values will be damped or set to zero.
Examples
x <- c(-10, -5, 0, 5, 10)
activation_relu(x)
activation_relu(x, negative_slope = 0.5)
activation_relu(x, max_value = 5)
activation_relu(x, threshold = 5)
See also
Other activations: activation_celu(), activation_elu(), activation_exponential(), activation_gelu(), activation_glu(), activation_hard_shrink(), activation_hard_sigmoid(), activation_hard_tanh(), activation_leaky_relu(), activation_linear(), activation_log_sigmoid(), activation_log_softmax(), activation_mish(), activation_relu6(), activation_selu(), activation_sigmoid(), activation_silu(), activation_soft_shrink(), activation_softmax(), activation_softplus(), activation_softsign(), activation_sparse_plus(), activation_sparse_sigmoid(), activation_sparsemax(), activation_squareplus(), activation_tanh(), activation_tanh_shrink(), activation_threshold()