The CeLU activation function is defined as:

celu(x) = alpha * (exp(x / alpha) - 1)   for x < 0
celu(x) = x                              for x >= 0

where alpha is a scaling parameter that controls where the activation saturates for negative inputs: the output is bounded below by -alpha, and for alpha > 0 the function is continuous and differentiable everywhere, including at x = 0.

Usage

activation_celu(x, alpha = 1)

Arguments

x

Input tensor.

alpha

Scaling parameter for the CeLU formulation. Defaults to 1.0.

Value

A tensor, the result from applying the activation to the input tensor x.