ReLU

Versioned name: ReLU-1

Category: Activation

Short description: ReLU performs element-wise rectified linear unit activation on the input tensor.

OpenVINO description: This OP is the same as the OpenVINO ReLU OP.

Detailed description: ReLU applies the rectified linear unit function to each element of the input tensor: negative values are replaced with zero, while non-negative values are passed through unchanged.

Attributes: ReLU operation has no attributes.

Mathematical Formulation:

\[Y_{i}^{(l)} = \max(0, Y_{i}^{(l-1)})\]
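
As an illustration, below is a minimal NumPy sketch of the element-wise formula above; it is not the library's actual implementation, and the `relu` helper name is chosen here for demonstration only.

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Apply ReLU element-wise: y_i = max(0, x_i)."""
    return np.maximum(0, x)

# Negative elements become zero; non-negative elements are unchanged.
x = np.array([[-1.5, 0.0], [2.0, -0.25]], dtype=np.float32)
y = relu(x)
print(y)        # [[0.  0.  ]
                #  [2.  0.  ]]
print(y.dtype)  # float32 -- the output keeps the input data type,
                # as noted under "Types" below.
```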

Inputs:

  • 1: input - multidimensional input tensor. Required.

    • Type: T

Outputs:

  • 1: output - the result of applying ReLU element-wise to the input tensor. It has the same shape as the input.

    • Type: T

Types:

  • T: f32, f16, bf16.

  • Note: Inputs and outputs have the same data type denoted by T. For example, if the input is an f32 tensor, then all other tensors have the f32 data type.