Results 1 - 2 of 2 for pad_delta (0.22 sec)

  1. tensorflow/compiler/mlir/tensorflow/ir/tf_generated_ops.td

    executed.
      }];
    
      let arguments = (ins
        Arg<TF_Float32Tensor, [{Value of parameters used in the Adadelta optimization algorithm.}]>:$parameters,
        Arg<TF_Float32Tensor, [{Value of accumulators used in the Adadelta optimization algorithm.}]>:$accumulators,
        Arg<TF_Float32Tensor, [{Value of updates used in the Adadelta optimization algorithm.}]>:$updates,
    
        DefaultValuedOptionalAttr<I64Attr, "-1">:$table_id,
    - Registered: Sun Jun 16 05:45:23 UTC 2024
    - Last Modified: Tue Jun 11 23:24:08 UTC 2024
    - 793K bytes
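The three arguments in the snippet above (`parameters`, `accumulators`, `updates`) correspond to the per-parameter state that the Adadelta algorithm carries between steps: the weights themselves, the running average of squared gradients, and the running average of squared parameter deltas. A minimal pure-Python sketch of one Adadelta update over flat lists of floats (function name and hyperparameter defaults are assumptions for illustration, not taken from the op definition):

```python
import math

def adadelta_step(parameters, accumulators, updates, grads,
                  rho=0.95, epsilon=1e-7):
    """One Adadelta step, updating state in place.

    accumulators: running average of squared gradients, E[g^2].
    updates:      running average of squared deltas, E[dx^2].
    """
    for i, g in enumerate(grads):
        # Decay the squared-gradient accumulator toward the new gradient.
        accumulators[i] = rho * accumulators[i] + (1 - rho) * g * g
        # Scale the step by the ratio of the two running RMS values.
        delta = -(math.sqrt(updates[i] + epsilon)
                  / math.sqrt(accumulators[i] + epsilon)) * g
        # Decay the squared-delta accumulator toward the new delta.
        updates[i] = rho * updates[i] + (1 - rho) * delta * delta
        parameters[i] += delta
    return parameters, accumulators, updates
```

Because the step size is derived from the two accumulators rather than a fixed learning rate, all three tensors must be loaded and stored together, which is why the op takes them as a bundle.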
  2. RELEASE.md

        *   Added `tf.keras.optimizers.experimental.Optimizer`. The reworked
            optimizer gives more control over different phases of optimizer calls,
            and is easier to customize. We provide Adam, SGD, Adadelta, AdaGrad and
            RMSprop optimizers based on
            `tf.keras.optimizers.experimental.Optimizer`. Generally the new
            optimizers work in the same way as the old ones, but support new
    - Registered: Sun Jun 16 05:45:23 UTC 2024
    - Last Modified: Tue Jun 11 23:24:08 UTC 2024
    - 730.3K bytes
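The "control over different phases" mentioned in the changelog refers to the reworked optimizer separating one-time state creation from the per-variable update. A dependency-free sketch of that phase split, loosely mirroring the method names of the reworked Keras class (this is a standalone toy, not the actual `tf.keras.optimizers.experimental.Optimizer` implementation):

```python
class SGDLike:
    """Toy momentum-SGD optimizer illustrating the two-phase structure:
    build() allocates per-variable slot state once, update_step()
    applies a single variable's update from its gradient."""

    def __init__(self, learning_rate=0.01, momentum=0.9):
        self.lr = learning_rate
        self.momentum = momentum
        self.velocities = None  # slot state, created lazily by build()

    def build(self, variables):
        # Phase 1: allocate one velocity slot per variable.
        self.velocities = [0.0 for _ in variables]

    def update_step(self, grad, variables, index):
        # Phase 2: update exactly one variable from its gradient.
        v = self.momentum * self.velocities[index] - self.lr * grad
        self.velocities[index] = v
        variables[index] += v

    def apply_gradients(self, grads, variables):
        # Driver: build state on first use, then apply each update.
        if self.velocities is None:
            self.build(variables)
        for i, g in enumerate(grads):
            self.update_step(g, variables, i)
```

Customizing an optimizer under this structure means overriding only the phase you care about (e.g. a different slot layout in `build`, or a different rule in `update_step`) while the driver loop stays untouched.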