Versioned name: EluBackprop-1

Category: Activation

Short description: EluBackprop computes the gradient of the ELU (Exponential Linear Unit) activation with respect to its input.
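For reference, the standard ELU function and its derivative — which this backward op presumably applies via the chain rule to produce input_delta from output_delta — can be written as:

```latex
\mathrm{ELU}(x) =
\begin{cases}
x, & x > 0 \\
\alpha\,(e^{x} - 1), & x \le 0
\end{cases}
\qquad
\frac{d\,\mathrm{ELU}}{dx} =
\begin{cases}
1, & x > 0 \\
\alpha\, e^{x}, & x \le 0
\end{cases}
```

Note that for x ≤ 0, α·eˣ = ELU(x) + α, so the derivative can be evaluated either from the forward input (src) or from the forward result (dst); this is what the use_dst attribute selects.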


Attributes

  • alpha

    • Description: alpha is the scale for the negative factor of ELU.

    • Range of values: arbitrary non-negative f32 value

    • Type: f32

    • Required: yes

  • use_dst

    • Description: If true, the gradient is computed from the forward result (dst); otherwise it is computed from the forward input (src).

    • Range of values: True or False

    • Type: bool

    • Default value: True

    • Required: no


Inputs

  • 1: result_forward / input_forward - result_forward (dst) if use_dst is true, otherwise input_forward (src). Required.

    • Type: T

  • 2: output_delta - the gradient tensor with respect to the output of ELU. Required.

    • Type: T


Outputs

  • 1: input_delta - the gradient tensor with respect to the input of ELU.

    • Type: T


Types

  • T: f32, f16, bf16.

  • Note: Inputs and outputs have the same data type, denoted by T. For example, if the input is an f32 tensor, then all other tensors have the f32 data type.
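As an illustration, here is a minimal NumPy sketch of the semantics above. The function name and signature are hypothetical (not part of any library API); the two branches correspond to the two settings of use_dst.

```python
import numpy as np

def elu_backprop(forward_tensor, output_delta, alpha=1.0, use_dst=True):
    """Hypothetical reference sketch of EluBackprop.

    forward_tensor is result_forward (dst) when use_dst=True,
    and input_forward (src) when use_dst=False.
    """
    if use_dst:
        # For x <= 0, dst = alpha*(exp(x)-1), so the local derivative
        # alpha*exp(x) equals dst + alpha. dst > 0 iff x > 0 (alpha > 0).
        local_grad = np.where(forward_tensor > 0, 1.0, forward_tensor + alpha)
    else:
        # Derivative computed directly from src: 1 for x > 0, alpha*exp(x) otherwise.
        local_grad = np.where(forward_tensor > 0, 1.0, alpha * np.exp(forward_tensor))
    # Chain rule: input_delta = output_delta * dELU/dx
    return output_delta * local_grad
```

Both paths yield the same input_delta, which is why the op can take either dst or src as its first input.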