SiLU
class torch.nn.SiLU(inplace: bool = False)

Applies the SiLU (Sigmoid Linear Unit) function, element-wise.

silu(x) = x ∗ σ(x), where σ(x) is the logistic sigmoid.

Note
See Gaussian Error Linear Units (GELUs) where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function where the SiLU was experimented with later.
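As a quick check of the definition above, the following sketch (the input shape is arbitrary and chosen only for illustration) confirms that nn.SiLU produces the same values as computing x ∗ σ(x) directly with torch.sigmoid:

>>> import torch
>>> import torch.nn as nn
>>> x = torch.randn(3, 4)
>>> # SiLU applied via the module equals the element-wise product x * sigmoid(x)
>>> torch.allclose(nn.SiLU()(x), x * torch.sigmoid(x))
True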
- Shape:
  Input: (N, ∗), where ∗ means any number of additional dimensions
  Output: (N, ∗), same shape as the input
Examples:

>>> m = nn.SiLU()
>>> input = torch.randn(2)
>>> output = m(input)
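A functional form, torch.nn.functional.silu, computes the same operation, and the module's inplace=True option overwrites the input tensor instead of allocating a new one. A brief sketch (the input values are illustrative; printed tensors are rounded to four decimals):

>>> import torch
>>> import torch.nn as nn
>>> import torch.nn.functional as F
>>> x = torch.tensor([-1.0, 0.0, 1.0])
>>> F.silu(x)                   # functional form of the same operation
tensor([-0.2689,  0.0000,  0.7311])
>>> m = nn.SiLU(inplace=True)   # inplace=True modifies the input tensor directly
>>> m(x)
tensor([-0.2689,  0.0000,  0.7311])
>>> x                           # x itself now holds the SiLU output
tensor([-0.2689,  0.0000,  0.7311])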