Versioned name: LeakyReLU-1

Category: Activation

Short description: Leaky Rectified Linear Unit.

Detailed description: LeakyReLU is an activation function based on ReLU. Instead of mapping negative inputs to zero, it applies a small constant slope to them, so the function yields small, non-zero, constant gradients for negative values. This slope is also called the coefficient of leakage.

Unlike PReLU, the coefficient alpha is a fixed constant defined before training, not a learned parameter.

\[LeakyReLU(x) = \begin{cases} x & \text{if } x \geq 0 \\ \alpha x & \text{if } x < 0 \end{cases}\]
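The piecewise definition above can be sketched elementwise with NumPy; the function name `leaky_relu` and the use of NumPy are illustrative assumptions, not part of this specification.

```python
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float) -> np.ndarray:
    # Elementwise LeakyReLU: x where x >= 0, alpha * x otherwise.
    return np.where(x >= 0, x, alpha * x)

# Example: negative inputs are scaled by alpha, non-negative inputs pass through.
y = leaky_relu(np.array([-2.0, 0.0, 3.0]), alpha=0.1)
```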


Attributes:
  • alpha

    • Description: alpha is the coefficient of leakage.

    • Range of values: any f32 value; typically a small positive value.

    • Type: f32

    • Required: yes


Inputs:
  • 1: input - multidimensional input tensor. Required.

    • Type: T


Outputs:
  • 1: output - multidimensional output tensor with the same data type and shape as the input tensor.

    • Type: T


Types:
  • T: f32, f16, bf16.

  • Note: Inputs and outputs have the same data type, denoted by T. For example, if the input is an f32 tensor, then all other tensors have the f32 data type.
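To illustrate the type constraint above, a hedged sketch showing that the output keeps the input's data type and shape; casting alpha to the input dtype is an implementation assumption, not mandated by this specification.

```python
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float) -> np.ndarray:
    # Cast alpha to the input dtype so the result stays in T
    # (e.g. an f16 input produces an f16 output).
    return np.where(x >= 0, x, x.dtype.type(alpha) * x)

x = np.array([-1.0, 2.0], dtype=np.float16)
y = leaky_relu(x, alpha=0.2)
# y has the same dtype (float16) and shape as x.
```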