LeakyReLU

class torch.nn.LeakyReLU(negative_slope: float = 0.01, inplace: bool = False)[source]

Applies the element-wise function:
LeakyReLU(x) = max(0, x) + negative_slope ∗ min(0, x)

or equivalently

LeakyReLU(x) = { x                    if x ≥ 0
               { negative_slope × x   otherwise

- Parameters
negative_slope – Controls the angle of the negative slope. Default: 1e-2
inplace – can optionally do the operation in-place. Default: False
- Shape:
Input: (N, ∗), where ∗ means any number of additional dimensions
Output: (N, ∗), same shape as the input
Examples:
>>> m = nn.LeakyReLU(0.1)
>>> input = torch.randn(2)
>>> output = m(input)
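The two formulations above (the max/min form and the piecewise form) compute the same value. A minimal pure-Python sketch, not using torch, can illustrate this equivalence; the function names here are hypothetical, chosen for the example:

```python
def leaky_relu(x, negative_slope=0.01):
    # Piecewise form: pass x through unchanged when non-negative,
    # otherwise scale it by negative_slope.
    return x if x >= 0 else negative_slope * x

def leaky_relu_minmax(x, negative_slope=0.01):
    # max/min form from the docs: max(0, x) + negative_slope * min(0, x).
    # For x >= 0 the second term is 0; for x < 0 the first term is 0.
    return max(0.0, x) + negative_slope * min(0.0, x)

# Both forms agree on positive, negative, and zero inputs.
for v in (-2.0, -0.5, 0.0, 1.5):
    assert leaky_relu(v, 0.1) == leaky_relu_minmax(v, 0.1)
```

Unlike ReLU, negative inputs are not zeroed out but attenuated by `negative_slope`, which keeps a small gradient flowing for negative activations.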