RReLU
class torch.nn.RReLU(lower: float = 0.125, upper: float = 0.3333333333333333, inplace: bool = False)[source]

Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper:
Empirical Evaluation of Rectified Activations in Convolutional Network.
The function is defined as:
$$\text{RReLU}(x) = \begin{cases} x & \text{if } x \geq 0 \\ ax & \text{otherwise} \end{cases}$$

where $a$ is randomly sampled from the uniform distribution $\mathcal{U}(\text{lower}, \text{upper})$.
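For intuition, the following sketch reproduces the definition above with plain tensor operations. It is only an illustration of the formula (the helper name rrelu_reference is hypothetical), not how torch.nn.RReLU is implemented internally:

>>> import torch
>>> def rrelu_reference(x, lower=1/8, upper=1/3):
...     # sample a ~ U(lower, upper) independently for each element
...     a = torch.empty_like(x).uniform_(lower, upper)
...     # keep x where x >= 0, otherwise scale by the sampled slope a
...     return torch.where(x >= 0, x, a * x)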
- Parameters
lower – lower bound of the uniform distribution. Default: 1/8
upper – upper bound of the uniform distribution. Default: 1/3
inplace – can optionally do the operation in-place. Default: False
- Shape:
Input: $(N, *)$ where $*$ means any number of additional dimensions
Output: $(N, *)$, same shape as the input
Examples:
>>> m = nn.RReLU(0.1, 0.3)
>>> input = torch.randn(2)
>>> output = m(input)
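As a usage note, nn.RReLU is stochastic only in training mode; in evaluation mode the negative slope is fixed to the average of lower and upper (the test-time rule from the paper), so outputs are deterministic. A short sketch of that behavior, assuming the standard train/eval semantics of nn.Module:

>>> m = nn.RReLU(0.1, 0.3)              # modules start in training mode: slope resampled per call
>>> x = -torch.rand(4)                  # negative inputs, where the random slope matters
>>> y1, y2 = m(x), m(x)                 # y1 and y2 generally differ element-wise
>>> m_eval = nn.RReLU(0.1, 0.3).eval()  # eval mode: fixed slope of (0.1 + 0.3) / 2
>>> torch.equal(m_eval(x), m_eval(x))
True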