Results 1 - 2 of 2 for softmax_cross_entropy_with_logits (0.28 sec)

  1. tensorflow/compiler/mlir/tfr/examples/mnist/mnist_train.py

        labels = tf.one_hot(features['label'], num_classes)
    
        with tf.GradientTape() as tape:
          logits = model(inputs)
          loss_value = tf.reduce_mean(
              tf.nn.softmax_cross_entropy_with_logits(labels, logits))
    
        grads = tape.gradient(loss_value, model.trainable_variables)
        correct_prediction = tf.equal(tf.argmax(logits, 1), tf.argmax(labels, 1))
    - Last Modified: Wed Oct 20 03:05:18 UTC 2021
    - 6.5K bytes
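
The fragment above from mnist_train.py shows the core pattern: one-hot labels and raw logits are fed to tf.nn.softmax_cross_entropy_with_logits inside a GradientTape. A minimal self-contained sketch of that same pattern, assuming a small stand-in Keras model, an SGD optimizer, and random data in place of the MNIST features dict (none of which appear in the snippet), might look like:

    import tensorflow as tf

    num_classes = 10

    # Stand-in model and data; the real example builds these from the MNIST dataset.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(num_classes),  # emit raw logits, no softmax activation
    ])
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

    images = tf.random.uniform([32, 28, 28])  # fake image batch
    label_ids = tf.random.uniform([32], maxval=num_classes, dtype=tf.int32)
    labels = tf.one_hot(label_ids, num_classes)  # one-hot targets, as in the snippet

    with tf.GradientTape() as tape:
      logits = model(images)
      # The op expects unscaled logits and one-hot (or soft) labels;
      # reduce_mean averages the per-example losses into a scalar.
      loss_value = tf.reduce_mean(
          tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))

    grads = tape.gradient(loss_value, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))

    correct_prediction = tf.equal(tf.argmax(logits, 1), tf.argmax(labels, 1))
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))

Applying the gradients with an optimizer is an assumption here; the snippet itself only shows the loss, gradient, and accuracy computation.
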
  2. RELEASE.md

    *   Deterministic Op Functionality (enabled by setting `TF_DETERMINISTIC_OPS` to
        `"true"` or `"1"`):
        *   Add a deterministic GPU implementation of
            `tf.nn.softmax_cross_entropy_with_logits`. See PR
            [49178](https://github.com/tensorflow/tensorflow/pull/49178).
        *   Add a deterministic CPU implementation of `tf.image.crop_and_resize`.
    - Last Modified: Tue Jun 11 23:24:08 UTC 2024
    - 730.3K bytes
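
According to the release note above, op determinism in that release is controlled by the TF_DETERMINISTIC_OPS environment variable. A minimal sketch of opting in, assuming the variable is set before TensorFlow executes any ops (the labels and logits values here are made up for illustration):

    import os

    # "1" or "true" both enable determinism per the release note; set this
    # before TensorFlow executes any GPU ops.
    os.environ["TF_DETERMINISTIC_OPS"] = "1"

    import tensorflow as tf

    labels = tf.one_hot([2, 0, 1], depth=3)
    logits = tf.constant([[0.1, 0.2, 3.0],
                          [2.0, 0.1, 0.1],
                          [0.3, 1.5, 0.2]])

    # With the flag set, the GPU implementation of this op (added in the PR
    # referenced above) is intended to produce reproducible results across runs.
    loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
    print(loss.numpy())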