SELU
class torch.nn.SELU(inplace: bool = False) [source]

Applied element-wise, as:
SELU(x) = scale * (max(0, x) + min(0, α * (exp(x) − 1)))

with α = 1.6732632423543772848170429916717 and scale = 1.0507009873554804934193349852946.
More details can be found in the paper Self-Normalizing Neural Networks.
- Parameters
inplace (bool, optional) – can optionally do the operation in-place. Default: False
- Shape:
Input: (N, *), where * means any number of additional dimensions
Output: (N, *), same shape as the input
Examples:

>>> m = nn.SELU()
>>> input = torch.randn(2)
>>> output = m(input)