Results 1 - 2 of 2 for experimental_distribute_dataset (0.25 sec)

  1. tensorflow/compiler/mlir/tfr/examples/mnist/mnist_train.py

      ds_train = tfds.load('mnist', split='train', shuffle_files=True)
      ds_train = ds_train.shuffle(1024).batch(batch_size).prefetch(64)
      ds_train = strategy.experimental_distribute_dataset(ds_train)
    
      with strategy.scope():
        # Create an mnist float model with the specified float state.
        model = FloatModel()
        optimizer = tf.keras.optimizers.Adam(learning_rate=learning_rate)
    
    - Last Modified: Wed Oct 20 03:05:18 UTC 2021
    - 6.5K bytes
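The excerpt above ends before the training step that consumes the distributed dataset. The following is a minimal sketch of how a dataset returned by `strategy.experimental_distribute_dataset` is typically iterated in a custom training loop; the step function, loss computation, and image/label handling are illustrative assumptions and are not taken from mnist_train.py (only `strategy`, `model`, `optimizer`, `batch_size`, and `ds_train` refer to names in the excerpt).

    import tensorflow as tf

    @tf.function
    def distributed_train_step(dist_inputs):
      def train_step(inputs):
        # Assumed tfds 'mnist' element structure: a dict with 'image' and 'label'.
        images = tf.cast(inputs['image'], tf.float32) / 255.0
        labels = inputs['label']
        with tf.GradientTape() as tape:
          logits = model(images, training=True)  # `model` built under strategy.scope() above
          per_example_loss = tf.keras.losses.sparse_categorical_crossentropy(
              labels, logits, from_logits=True)
          # Scale by the global batch size so that summing across replicas
          # yields the global mean loss.
          loss = tf.nn.compute_average_loss(per_example_loss,
                                            global_batch_size=batch_size)
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
        return loss

      # Run the step on every replica, then reduce the per-replica losses.
      per_replica_losses = strategy.run(train_step, args=(dist_inputs,))
      return strategy.reduce(tf.distribute.ReduceOp.SUM, per_replica_losses,
                             axis=None)

    for batch in ds_train:
      distributed_train_step(batch)
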
  2. RELEASE.md

            causes of issues involving infinities and `NaN`s.
    *   `tf.distribute`
        *   Custom training loop support on TPUs and TPU pods is available through
            `strategy.experimental_distribute_dataset`,
            `strategy.experimental_distribute_datasets_from_function`,
            `strategy.experimental_run_v2`, `strategy.reduce`.
        *   Support for a global distribution strategy through
    - Last Modified: Tue Jun 11 23:24:08 UTC 2024
    - 730.3K bytes
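The release note above names the custom-training-loop entry points. Two of them have since been renamed in later TensorFlow releases: `experimental_run_v2` became `Strategy.run`, and `experimental_distribute_datasets_from_function` became `distribute_datasets_from_function`. Below is a hedged sketch of the per-replica dataset variant using a `MirroredStrategy` as a stand-in for a TPU strategy; the dataset contents and `GLOBAL_BATCH_SIZE` are illustrative assumptions, not code from the release.

    import tensorflow as tf

    strategy = tf.distribute.MirroredStrategy()  # stands in for a TPU strategy here
    GLOBAL_BATCH_SIZE = 64

    def dataset_fn(input_context):
      # Each input pipeline builds its own shard and uses a per-replica batch size.
      per_replica_batch = input_context.get_per_replica_batch_size(GLOBAL_BATCH_SIZE)
      ds = tf.data.Dataset.from_tensor_slices(tf.random.normal([1024, 10]))
      ds = ds.shard(input_context.num_input_pipelines,
                    input_context.input_pipeline_id)
      return ds.batch(per_replica_batch)

    dist_ds = strategy.experimental_distribute_datasets_from_function(dataset_fn)

    # Iteration then follows the same pattern as the sketch under result 1:
    #   per_replica_out = strategy.run(step_fn, args=(batch,))
    #   total = strategy.reduce(tf.distribute.ReduceOp.SUM, per_replica_out, axis=None)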