API



RReLU

class torch.nn.RReLU(lower: float = 0.125, upper: float = 0.3333333333333333, inplace: bool = False)[source]

Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper:

Empirical Evaluation of Rectified Activations in Convolutional Network.

The function is defined as:

$$\text{RReLU}(x) =
\begin{cases}
    x & \text{if } x \geq 0 \\
    ax & \text{otherwise}
\end{cases}$$

where $a$ is randomly sampled from the uniform distribution $\mathcal{U}(\text{lower}, \text{upper})$.
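
As an illustration of the definition above (a minimal sketch, not the module's internal implementation), the formula can be applied by hand; the names `x`, `a`, and `out` below are assumptions of this example:

>>> import torch
>>> lower, upper = 1 / 8, 1 / 3                       # the module defaults
>>> x = torch.randn(5)
>>> a = torch.empty_like(x).uniform_(lower, upper)    # a ~ U(lower, upper), one slope per element
>>> out = torch.where(x >= 0, x, a * x)               # x if x >= 0, else a * x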

Parameters
  • lower – lower bound of the uniform distribution. Default: $\frac{1}{8}$

  • upper – upper bound of the uniform distribution. Default: $\frac{1}{3}$

  • inplace – can optionally do the operation in-place. Default: False

Shape:
  • Input: $(N, *)$ where $*$ means any number of additional dimensions

  • Output: $(N, *)$, same shape as the input

Examples:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.RReLU(0.1, 0.3)
>>> input = torch.randn(2)
>>> output = m(input)
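
As the Shape section states, the output keeps the input shape for any number of additional dimensions; a quick check (the tensor sizes here are arbitrary):

>>> m = nn.RReLU()
>>> x = torch.randn(4, 3, 8, 8)
>>> m(x).shape
torch.Size([4, 3, 8, 8])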
