TensorFlow

API

 tf.keras / optimizers / Nadam


Optimizer that implements the NAdam algorithm.

Inherits From: Optimizer

Much like Adam is essentially RMSprop with momentum, Nadam is Adam with Nesterov momentum.
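To make the relationship concrete, below is an illustrative pure-NumPy sketch of a single Nadam update step (based on Dozat's formulation: Adam's bias-corrected moment estimates combined with a Nesterov-style lookahead on the momentum term). This is not TensorFlow's implementation; the function name and defaults here are chosen for illustration only.

```python
import numpy as np

def nadam_step(theta, grad, m, v, t,
               lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-7):
    """One Nadam update at step t (t starts at 1). Illustrative sketch."""
    m = beta_1 * m + (1 - beta_1) * grad        # 1st moment (momentum)
    v = beta_2 * v + (1 - beta_2) * grad ** 2   # 2nd moment (RMS term)
    m_hat = m / (1 - beta_1 ** t)               # bias-corrected moments
    v_hat = v / (1 - beta_2 ** t)
    # Nesterov lookahead: blend corrected momentum with the current gradient
    m_bar = beta_1 * m_hat + (1 - beta_1) * grad / (1 - beta_1 ** t)
    theta = theta - lr * m_bar / (np.sqrt(v_hat) + epsilon)
    return theta, m, v

# Minimize f(x) = x**2 (gradient 2x) for a few hundred steps
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 201):
    theta, m, v = nadam_step(theta, 2 * theta, m, v, t, lr=0.05)
```

With `beta_1 = 0`, `m_bar` reduces to the raw corrected gradient term and the update degenerates toward RMSprop-style behavior, which is the sense in which Nadam generalizes Adam.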

Args:
  learning_rate: A Tensor or a floating-point value. The learning rate.
  beta_1: A float value or a constant float tensor. The exponential decay rate for the 1st moment estimates.
  beta_2: A float value or a constant float tensor. The exponential decay rate for the 2nd moment estimates.
  epsilon: A small constant for numerical stability.
  name: Optional name for the operations created when applying gradients. Defaults to "Nadam".
  **kwargs: Keyword arguments. Allowed to be one of "clipnorm" or "clipvalue". "clipnorm" (float) clips gradients by norm; "clipvalue" (float) clips gradients by value.
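The two gradient-clipping keyword arguments have different semantics: "clipnorm" rescales the whole gradient so its L2 norm does not exceed the threshold (preserving direction), while "clipvalue" clamps each component independently. A small NumPy sketch of both behaviors (illustrative helper names, not TensorFlow's internals):

```python
import numpy as np

def clip_by_norm(grad, clipnorm):
    """Rescale grad so its L2 norm is at most clipnorm ("clipnorm" semantics)."""
    norm = np.linalg.norm(grad)
    if norm > clipnorm:
        grad = grad * (clipnorm / norm)
    return grad

def clip_by_value(grad, clipvalue):
    """Clamp each component to [-clipvalue, clipvalue] ("clipvalue" semantics)."""
    return np.clip(grad, -clipvalue, clipvalue)

g = np.array([3.0, 4.0])       # L2 norm is 5.0
by_norm = clip_by_norm(g, 1.0)     # rescaled to unit norm: [0.6, 0.8]
by_value = clip_by_value(g, 2.0)   # clamped per component: [2.0, 2.0]
```

Note that norm clipping keeps the gradient's direction, whereas value clipping can change it, as in the example above.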

Reference:
  Dozat, 2015.

Raises:
  ValueError: in case of any invalid argument.
