tf.nn.scale_regularization_loss


Scales the sum of the given regularization losses by the number of replicas.

Usage with a distribution strategy and a custom training loop:

with strategy.scope():
  def compute_loss(labels, predictions, sample_weight=None):
    per_example_loss = tf.keras.losses.sparse_categorical_crossentropy(
        labels, predictions)

    # Compute loss that is scaled by sample_weight and by global batch size.
    loss = tf.nn.compute_average_loss(
        per_example_loss,
        sample_weight=sample_weight,
        global_batch_size=GLOBAL_BATCH_SIZE)

    # Add scaled regularization losses.
    loss += tf.nn.scale_regularization_loss(tf.nn.l2_loss(weights))
    return loss
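
As a quick check of the scaling behavior described above, the short sketch below (not part of the original page) evaluates tf.nn.scale_regularization_loss on a constant tensor under a tf.distribute.MirroredStrategy; the choice of strategy and the example weights are illustrative assumptions.

import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()  # replica count depends on visible devices

weights = tf.constant([[1.0, 2.0], [3.0, 4.0]])
reg_loss = tf.nn.l2_loss(weights)  # 0.5 * (1 + 4 + 9 + 16) = 15.0

with strategy.scope():
  scaled = tf.nn.scale_regularization_loss(reg_loss)

# The summed regularization loss is divided by the number of replicas, so
# adding the per-replica loss contributions applies the penalty exactly once.
print(float(scaled))  # 15.0 / strategy.num_replicas_in_sync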

Args
  regularization_loss: Regularization loss.

Returns
  Scalar loss value.
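
For context, here is a minimal, hedged sketch of how the compute_loss above might sit inside a custom training loop driven by strategy.run. The toy model, dataset, optimizer, and GLOBAL_BATCH_SIZE value are illustrative assumptions, compute_loss is restated so the snippet runs on its own, and the `weights` being regularized are assumed here to be the first layer's kernel.

import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()
GLOBAL_BATCH_SIZE = 32

# Toy data: 256 examples, 8 features, 4 classes (illustrative only).
features = tf.random.normal([256, 8])
labels = tf.random.uniform([256], maxval=4, dtype=tf.int32)
dataset = tf.data.Dataset.from_tensor_slices((features, labels)).batch(GLOBAL_BATCH_SIZE)
dist_dataset = strategy.experimental_distribute_dataset(dataset)

with strategy.scope():
  model = tf.keras.Sequential([
      tf.keras.layers.Dense(16, activation="relu"),
      tf.keras.layers.Dense(4, activation="softmax"),
  ])
  optimizer = tf.keras.optimizers.SGD(0.01)

  def compute_loss(labels, predictions, sample_weight=None):
    per_example_loss = tf.keras.losses.sparse_categorical_crossentropy(
        labels, predictions)
    loss = tf.nn.compute_average_loss(
        per_example_loss,
        sample_weight=sample_weight,
        global_batch_size=GLOBAL_BATCH_SIZE)
    # Scale the L2 penalty on the first layer's kernel so that summing the
    # per-replica losses applies the penalty exactly once per step.
    loss += tf.nn.scale_regularization_loss(
        tf.nn.l2_loss(model.layers[0].kernel))
    return loss

def train_step(inputs):
  x, y = inputs
  with tf.GradientTape() as tape:
    predictions = model(x, training=True)
    loss = compute_loss(y, predictions)
  grads = tape.gradient(loss, model.trainable_variables)
  optimizer.apply_gradients(zip(grads, model.trainable_variables))
  return loss

@tf.function
def distributed_train_step(inputs):
  per_replica_losses = strategy.run(train_step, args=(inputs,))
  return strategy.reduce(
      tf.distribute.ReduceOp.SUM, per_replica_losses, axis=None)

for batch in dist_dataset:
  print(float(distributed_train_step(batch)))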
