CELU
class torch.nn.CELU(alpha: float = 1.0, inplace: bool = False)[source]

Applies the element-wise function:

CELU(x) = max(0, x) + min(0, α * (exp(x / α) − 1))

More details can be found in the paper Continuously Differentiable Exponential Linear Units.
- Parameters
  alpha – the α value for the CELU formulation. Default: 1.0
  inplace – can optionally do the operation in-place. Default: False
- Shape:
  Input: (N, *), where * means any number of additional dimensions
  Output: (N, *), same shape as the input
Examples:
>>> m = nn.CELU()
>>> input = torch.randn(2)
>>> output = m(input)
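To make the formula above concrete, here is a minimal scalar sketch of CELU using only the standard library; the `celu` helper and its signature are illustrative, not part of the PyTorch API, and nn.CELU itself applies the same computation element-wise over tensors.

```python
import math

def celu(x: float, alpha: float = 1.0) -> float:
    """Scalar CELU(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))."""
    return max(0.0, x) + min(0.0, alpha * (math.exp(x / alpha) - 1.0))

# Positive inputs pass through unchanged; negative inputs
# saturate smoothly toward -alpha as x decreases.
print(celu(1.5))
print(celu(-3.0))
print(celu(-3.0, alpha=0.5))
```

Note how `alpha` sets both the saturation level for large negative inputs and the scale of the exponential, which is what keeps the function continuously differentiable at x = 0.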