Results 1 - 2 of 2 for sgd (0.12 sec)

  1. tensorflow/compiler/mlir/tensorflow/ir/tf_generated_ops.td

      let results = (outs);
    }
    
    def TF_LoadTPUEmbeddingStochasticGradientDescentParametersOp : TF_Op<"LoadTPUEmbeddingStochasticGradientDescentParameters", [TF_MustExecute, TF_TPUEmbeddingReadEffect]> {
      let summary = "Load SGD embedding parameters.";
    
      let description = [{
    An op that loads optimization parameters into HBM for embedding. Must be
    preceded by a ConfigureTPUEmbeddingHost op that sets up the correct

    - Registered: Sun Jun 16 05:45:23 UTC 2024
    - Last Modified: Tue Jun 11 23:24:08 UTC 2024
    - 793K bytes
    - Viewed (0)
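
    For orientation, the op defined in that snippet surfaces in Python as
    `tf.raw_ops.LoadTPUEmbeddingStochasticGradientDescentParameters`. The sketch
    below is hypothetical: the table shape and name are made up, and the call only
    succeeds on a system whose TPU embedding has already been configured (e.g. by
    the ConfigureTPUEmbeddingHost op the description refers to).

    # Hypothetical sketch only: requires an already-configured TPU embedding
    # system; table shape and name are illustrative, not from the source.
    import tensorflow as tf

    def load_sgd_embedding_shard(parameters, shard_id, num_shards, table_name):
        # Push this shard's embedding table values into HBM under the given name.
        tf.raw_ops.LoadTPUEmbeddingStochasticGradientDescentParameters(
            parameters=parameters,
            num_shards=num_shards,
            shard_id=shard_id,
            table_name=table_name,
        )

    # Illustrative call (would fail without a real TPU embedding setup):
    # load_sgd_embedding_shard(tf.zeros([1024, 64]), shard_id=0,
    #                          num_shards=1, table_name="my_table")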
  2. RELEASE.md

        *   Added `tf.keras.optimizers.experimental.Optimizer`. The reworked
            optimizer gives more control over different phases of optimizer calls,
            and is easier to customize. We provide Adam, SGD, Adadelta, AdaGrad and
            RMSprop optimizers based on
            `tf.keras.optimizers.experimental.Optimizer`. Generally the new
            optimizers work in the same way as the old ones, but support new

    - Registered: Sun Jun 16 05:45:23 UTC 2024
    - Last Modified: Tue Jun 11 23:24:08 UTC 2024
    - 730.3K bytes
    - Viewed (0)
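
    As a usage sketch of the reworked optimizer API described in that release note:
    the example below assumes a TensorFlow release where
    `tf.keras.optimizers.experimental.SGD` is available (later releases expose the
    same class as `tf.keras.optimizers.SGD`); the toy model and data are made up
    for illustration.

    # Minimal sketch, assuming tf.keras.optimizers.experimental is present.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(1),
    ])
    # The experimental SGD follows the reworked Optimizer base class.
    opt = tf.keras.optimizers.experimental.SGD(learning_rate=0.01, momentum=0.9)
    model.compile(optimizer=opt, loss="mse")

    # Toy data, purely illustrative.
    x = tf.random.normal([32, 4])
    y = tf.random.normal([32, 1])
    model.fit(x, y, epochs=1, verbose=0)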