Versioned name: PReLU-1
Short description: Parametric rectified linear unit element-wise activation function.
Detailed description: The PReLU operation was introduced in the article "Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification" (He et al., 2015). PReLU performs an element-wise parametric ReLU operation on a given input tensor, based on the following mathematical formula:

::math:: output = \begin{cases} input, & input \geq 0 \\ slope \cdot input, & input < 0 \end{cases}
Attribute: data_format
Description: data_format denotes the data format of the input and output data.
Range of values: NXC or NCX (X means HW for 2D data, DHW for 3D data)
Default value: NXC

Attribute: per_channel_broadcast
Description: per_channel_broadcast denotes whether to apply per-channel broadcast when slope is a 1D tensor.
Range of values: True or False
Default value: True
input - input tensor. Required.
slope - slope tensor. Required.
output - output tensor.
T: f32, f16, bf16
Note: Inputs and outputs have the same data type, denoted by T. For example, if input is an f32 tensor, then all other tensors have the f32 data type.
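The element-wise formula above can be sketched in NumPy as follows; this is an illustrative sketch, not part of the specification, and the function name `prelu` is an assumption:

```python
import numpy as np

def prelu(x, slope):
    # Element-wise PReLU: keep non-negative values, scale negatives by slope.
    return np.where(x >= 0, x, slope * x)

x = np.array([-2.0, -0.5, 0.0, 3.0], dtype=np.float32)
y = prelu(x, np.float32(0.25))
# Negative entries are multiplied by 0.25; non-negative entries pass through.
```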
Only the slope tensor supports broadcasting semantics. The slope tensor is uni-directionally broadcast to input if one of the following rules is met:
1: slope is a 1D tensor and per_channel_broadcast is set to True; the length of the slope tensor must equal the size of the channel dimension of input.
2: slope is a 1D tensor and per_channel_broadcast is set to False; the length of the slope tensor must equal the size of the rightmost dimension of input.
3: slope is an nD tensor; starting from the rightmost dimension, ::math::input.shape[i] == slope.shape[i], or ::math::slope.shape[i] == 1, or slope has no dimension i.
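The three broadcasting rules can be sketched in NumPy for an NCX (here NCHW) input; this is a sketch under stated assumptions, and the helper name `prelu_nchw` is illustrative, not part of the specification:

```python
import numpy as np

def prelu_nchw(x, slope, per_channel_broadcast=True):
    # x: 4D tensor in NCHW layout; slope: slope tensor.
    if slope.ndim == 1:
        if per_channel_broadcast:
            # Rule 1: slope length must match the channel dimension (C of NCHW);
            # reshape so it broadcasts along channels.
            assert slope.shape[0] == x.shape[1]
            slope = slope.reshape(1, -1, 1, 1)
        else:
            # Rule 2: slope length must match the rightmost dimension (W of NCHW);
            # plain right-aligned broadcasting already lines it up.
            assert slope.shape[0] == x.shape[-1]
    # Rule 3 (nD slope) coincides with NumPy's right-aligned broadcasting,
    # where each trailing dimension must match, be 1, or be absent in slope.
    return np.where(x >= 0, x, slope * x)

x = np.full((1, 2, 2, 2), -1.0, dtype=np.float32)
out = prelu_nchw(x, np.array([0.1, 0.2], dtype=np.float32))
# Channel 0 negatives are scaled by 0.1, channel 1 negatives by 0.2.
```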