It is defined as:
soft_shrink(x) = x - threshold, if x > threshold
soft_shrink(x) = x + threshold, if x < -threshold
soft_shrink(x) = 0, otherwise
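
As a quick sketch, the same piecewise rule can be written directly in base R; the `threshold` default of 0.5 below is an assumption for illustration, not a value stated on this page:

soft_shrink_ref <- function(x, threshold = 0.5) {
  # Shrink values toward zero: subtract threshold above it,
  # add threshold below -threshold, and zero out everything in between.
  ifelse(x > threshold, x - threshold,
         ifelse(x < -threshold, x + threshold, 0))
}

soft_shrink_ref(c(-1, -0.25, 0, 0.25, 1))
#> [1] -0.5  0.0  0.0  0.0  0.5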
See also
Other activations: activation_celu(), activation_elu(), activation_exponential(), activation_gelu(), activation_glu(), activation_hard_shrink(), activation_hard_sigmoid(), activation_hard_tanh(), activation_leaky_relu(), activation_linear(), activation_log_sigmoid(), activation_log_softmax(), activation_mish(), activation_relu(), activation_relu6(), activation_selu(), activation_sigmoid(), activation_silu(), activation_softmax(), activation_softplus(), activation_softsign(), activation_sparse_plus(), activation_sparsemax(), activation_squareplus(), activation_tanh(), activation_tanh_shrink(), activation_threshold()