GELU

Versioned name: GELU-1

Category: Activation

Short description: Gaussian error linear unit element-wise activation function.

OpenVINO description: This OP is the same as the corresponding OpenVINO OP.

Detailed description: Gaussian Error Linear Units (GELUs), https://arxiv.org/abs/1606.08415

Attributes: GELU operation has no attributes.

Mathematical formulation: \(GELU(x) = x \cdot \Phi(x)\), where \(\Phi(x)\) is the cumulative distribution function of the standard Gaussian distribution.

\[GELU(x) = 0.5 \cdot x \cdot \left(1.0 + erf\left(x / \sqrt{2}\right)\right)\]

The following approximation (typical for TensorFlow*) may be used instead; whether it is applied is implementation-defined behavior.

\[GELU(x) \approx 0.5 \cdot x \cdot \left(1.0 + tanh\left(\sqrt{2.0/\pi} \cdot \left(x + 0.044715 \cdot x^{3}\right)\right)\right)\]
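The two formulas above can be sketched in NumPy as a reference check; this is an illustrative implementation, not the one used by OpenVINO, and the function names `gelu_exact` and `gelu_tanh` are made up for this example.

```python
import math
import numpy as np

def gelu_exact(x):
    # Exact form: GELU(x) = 0.5 * x * (1 + erf(x / sqrt(2)))
    # math.erf is scalar, so vectorize it for array inputs.
    erf = np.vectorize(math.erf)
    return 0.5 * x * (1.0 + erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # Tanh approximation (typical for TensorFlow*):
    # GELU(x) ~= 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))

x = np.array([-2.0, 0.0, 1.0])
print(gelu_exact(x))
print(gelu_tanh(x))
```

For inputs near zero the two forms agree to roughly three decimal places, which is why the approximation is acceptable as implementation-defined behavior.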

Inputs:

  • 1: input - multidimensional input tensor. Required.

    • Type: T

Outputs:

  • 1: output - result of GELU function applied to the input tensor. Required.

    • Type: T

Types:

  • T: f32, f16, bf16.

  • Note: Inputs and outputs have the same data type, denoted by T. For example, if the input is an f32 tensor, then all other tensors have the f32 data type.