It is defined piecewise as:

f(x) = 0             for x <= -1,
f(x) = 0.5 * (x + 1) for -1 < x < 1,
f(x) = 1             for x >= 1.
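
As a quick illustration, here is a minimal base-R sketch of this piecewise function. The helper name sparse_sigmoid_manual is illustrative only, not part of the keras3 API; in practice you would use the corresponding keras3 activation function, which operates on tensors.

# Direct base-R translation of the piecewise definition above.
# Name is illustrative, not a keras3 function.
sparse_sigmoid_manual <- function(x) {
  ifelse(x <= -1, 0,
         ifelse(x >= 1, 1, 0.5 * (x + 1)))
}

# Spot-check one value in each of the three pieces:
sparse_sigmoid_manual(c(-2, -1, 0, 0.5, 1, 2))
#> [1] 0.00 0.00 0.50 0.75 1.00 1.00

Note that the function is linear on (-1, 1) and saturates exactly at 0 and 1 outside that interval, so its gradient is exactly zero for |x| >= 1.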
See also
Other activations: activation_celu(), activation_elu(), activation_exponential(), activation_gelu(), activation_glu(), activation_hard_shrink(), activation_hard_sigmoid(), activation_hard_tanh(), activation_leaky_relu(), activation_linear(), activation_log_sigmoid(), activation_log_softmax(), activation_mish(), activation_relu(), activation_relu6(), activation_selu(), activation_sigmoid(), activation_silu(), activation_soft_shrink(), activation_softmax(), activation_softplus(), activation_softsign(), activation_sparse_plus(), activation_sparsemax(), activation_squareplus(), activation_tanh(), activation_tanh_shrink(), activation_threshold()