
The Gated Linear Unit (GLU) activation function is defined as:

glu(x) = a * sigmoid(b),

where x is split into two equal parts a and b along the given axis.
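For example, with axis = -1 and x = [x1, x2, x3, x4], the split gives a = [x1, x2] and b = [x3, x4], so glu(x) = [x1 * sigmoid(x3), x2 * sigmoid(x4)].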

Usage

activation_glu(x, axis = -1L)

Arguments

x

Input tensor.

axis

The axis along which to split the input tensor. Defaults to -1.

Value

A tensor, the result of applying the activation to the input tensor x.
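
Examples

A minimal sketch, assuming the keras3 package with a working backend installed; op_convert_to_tensor() is used here only to build the example input from an R matrix.

library(keras3)

# x has four features; GLU splits the last axis into halves a and b
x <- op_convert_to_tensor(matrix(c(-1, 0, 1, 2), nrow = 1))
activation_glu(x)
# Here a = [-1, 0] and b = [1, 2], so the result is
# a * sigmoid(b) = [-1 * sigmoid(1), 0 * sigmoid(2)] ≈ [-0.731, 0]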