Results 1 - 2 of 2 for aloop (0.26 sec)

  1. RELEASE.md

    *   `tf.distribute`
    
        *   Opened an experimental API, `tf.distribute.experimental.coordinator.get_current_worker_index`, for retrieving the worker index from within a worker, when using parameter server training with a custom training loop.
    
    *   `tf.experimental.dtensor`
    
    Registered: Sun Jun 16 05:45:23 UTC 2024
    - Last Modified: Tue Jun 11 23:24:08 UTC 2024
    - 730.3K bytes
    - Viewed (0)
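
    The release note above mentions `tf.distribute.experimental.coordinator.get_current_worker_index` only briefly. The sketch below shows one way the worker index might be used to shard input data per worker in parameter server training with a custom training loop; the cluster setup, `NUM_WORKERS`, the toy dataset, and the trivial train step are illustrative assumptions, not part of the RELEASE.md entry.

    ```python
    import tensorflow as tf

    # Sketch only: assumes a TF_CONFIG-based cluster is already running and that
    # the resolver below points at it; NUM_WORKERS and the data are placeholders.
    NUM_WORKERS = 4

    cluster_resolver = tf.distribute.cluster_resolver.TFConfigClusterResolver()
    strategy = tf.distribute.experimental.ParameterServerStrategy(cluster_resolver)
    coordinator = tf.distribute.experimental.coordinator.ClusterCoordinator(strategy)


    def per_worker_dataset_fn():
        # Runs on each worker: the experimental API returns this worker's index,
        # which is used here to pick a distinct shard of the input data.
        worker_index = tf.distribute.experimental.coordinator.get_current_worker_index()
        ds = tf.data.Dataset.range(10_000).shard(NUM_WORKERS, worker_index)
        return ds.batch(32)


    per_worker_dataset = coordinator.create_per_worker_dataset(per_worker_dataset_fn)
    per_worker_iterator = iter(per_worker_dataset)

    with strategy.scope():
        weight = tf.Variable(0.0)


    @tf.function
    def train_step(iterator):
        def replica_fn(batch):
            # Trivial update so the custom loop has something to do.
            weight.assign_add(tf.cast(tf.reduce_sum(batch), tf.float32))

        batch = next(iterator)
        strategy.run(replica_fn, args=(batch,))


    # Custom training loop driven from the coordinator.
    for _ in range(10):
        coordinator.schedule(train_step, args=(per_worker_iterator,))
    coordinator.join()
    ```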
  2. tensorflow/compiler/mlir/tensorflow/ir/tf_generated_ops.td

    preceded by a ConfigureTPUEmbeddingHost op that sets up the correct
    embedding table configuration. For example, this op is used to install
    parameters that are loaded from a checkpoint before a training loop is
    executed.
      }];
    
      let arguments = (ins
        Arg<TF_Float32Tensor, [{Value of parameters used in the ADAM optimization algorithm.}]>:$parameters,
    Registered: Sun Jun 16 05:45:23 UTC 2024
    - Last Modified: Tue Jun 11 23:24:08 UTC 2024
    - 793K bytes
    - Viewed (0)
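
    The op definition excerpted above installs ADAM embedding parameters loaded from a checkpoint before a training loop runs on the TPU host. Outside that TPU-specific path, the same restore-then-train pattern looks roughly like the sketch below; the model, optimizer, checkpoint path, and data are placeholders, and this is not the TPU embedding API itself.

    ```python
    import tensorflow as tf

    # Sketch of "restore parameters from a checkpoint, then run the training loop";
    # everything concrete here (model, optimizer, paths, data) is illustrative.
    model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
    optimizer = tf.keras.optimizers.Adam()

    ckpt = tf.train.Checkpoint(model=model, optimizer=optimizer)
    latest = tf.train.latest_checkpoint("/tmp/ckpts")  # path is illustrative
    if latest:
        # Install previously saved parameters before the loop starts.
        ckpt.restore(latest)

    dataset = tf.data.Dataset.from_tensor_slices(
        (tf.random.normal([64, 4]), tf.random.normal([64, 1]))).batch(8)

    # Custom training loop: runs only after the checkpointed parameters are in place.
    for x, y in dataset:
        with tf.GradientTape() as tape:
            loss = tf.reduce_mean(tf.square(model(x) - y))
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))

    ckpt.save("/tmp/ckpts/ckpt")  # persist updated parameters for the next run
    ```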