Results 1 - 5 of 5 for _backprop (0.35 sec)

  1. tensorflow/compiler/mlir/tensorflow/transforms/lower_tf.td

    // computes loss and backprop of the loss with respect to 'features'.
    //
    // Softmax cross entropy loss is defined as follows:
    //
    //  loss = Sum(-labels * Log(Exp(features) / Sum(Exp(features))))
    //  loss = Sum(-labels * LogSoftmax(features))
    //
    // Computing the gradient of the loss with respect to 'features' gives us,
    //
    //  backprop = (Exp(features) / Sum(Exp(features))) - labels
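
    A quick numeric check of the identity above (a NumPy sketch, not code from this file): with one-hot labels, the gradient of the summed cross-entropy loss with respect to features is Softmax(features) - labels.

    ```python
    import numpy as np

    def softmax(z):
        e = np.exp(z - z.max())  # shift by max for numerical stability
        return e / e.sum()

    features = np.array([0.5, -1.2, 2.0])
    labels = np.array([0.0, 0.0, 1.0])  # one-hot target

    # loss = Sum(-labels * LogSoftmax(features))
    loss = -(labels * np.log(softmax(features))).sum()

    # backprop = Softmax(features) - labels, per the comment above
    backprop = softmax(features) - labels

    # Finite-difference check of d(loss)/d(features)
    eps = 1e-6
    numeric = np.array([
        (-(labels * np.log(softmax(features + eps * np.eye(3)[i]))).sum() - loss) / eps
        for i in range(3)
    ])
    assert np.allclose(backprop, numeric, atol=1e-4)
    ```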
    - Last Modified: Tue Jun 04 13:30:42 UTC 2024
    - 24.7K bytes
  2. tensorflow/compiler/mlir/tf2xla/transforms/legalize_tf.cc

          Value scratch2 =
              ApplyReduction(loc, weighted_grad, reduce_dims, &rewriter);
    
          // x_backprop = y_backprop * (scale * scratch1)
          auto scaled_grad =
              rewriter.create<mhlo::MulOp>(loc, op.getScale(), scratch1);
          x_backprop = rewriter.create<mhlo::MulOp>(
              loc, grad,
              Broadcast1DToFeatureDim(loc, act, scaled_grad, feature_dim,
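
    The multiply above implements x_backprop = y_backprop * (scale * scratch1), with the per-channel factor broadcast along the feature dimension. A minimal NumPy sketch of that formula (names mirror the C++ locals; that scratch1 = rsqrt(variance + epsilon), as in the inference path of this lowering, is our assumption):

    ```python
    import numpy as np

    # Illustrative NHWC shapes; feature_dim is the last axis.
    n, h, w, c = 2, 4, 4, 3
    grad = np.random.randn(n, h, w, c)      # y_backprop
    scale = np.random.randn(c)
    variance = np.random.rand(c) + 0.1
    epsilon = 1e-3

    # Assumption: scratch1 = rsqrt(variance + epsilon)
    scratch1 = 1.0 / np.sqrt(variance + epsilon)

    # Broadcast1DToFeatureDim: a length-c vector broadcast against NHWC;
    # NumPy broadcasts it over the trailing axis automatically.
    x_backprop = grad * (scale * scratch1)
    ```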
    - Last Modified: Tue Jun 11 20:00:43 UTC 2024
    - 291.8K bytes
  3. tensorflow/compiler/mlir/tensorflow/ir/tf_generated_ops.td

    example, suppose y = f(x) and we wish to apply a custom function g for backprop
    such that dx = g(dy). In Python,
    
    ```python
    with tf.get_default_graph().gradient_override_map(
        {'IdentityN': 'OverrideGradientWithG'}):
      y, _ = identity_n([f(x), x])
    
    @tf.RegisterGradient('OverrideGradientWithG')
    def ApplyG(op, dy, _):
      return [None, g(dy)]  # Do not backprop to f(x).
    ```
      }];
    
      let arguments = (ins
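
    An aside, not part of this file: in TF2 eager code the same override is usually written with tf.custom_gradient. A sketch, with hypothetical stand-ins for f and g:

    ```python
    import tensorflow as tf

    def f(x):
        return tf.square(x)   # hypothetical forward function f

    def g(dy):
        return 2.0 * dy       # hypothetical custom backprop function g

    @tf.custom_gradient
    def f_with_custom_grad(x):
        y = f(x)
        def grad(dy):
            return g(dy)      # dx = g(dy), bypassing f's own gradient
        return y, grad

    x = tf.constant(3.0)
    with tf.GradientTape() as tape:
        tape.watch(x)
        y = f_with_custom_grad(x)
    print(tape.gradient(y, x))  # 2.0 = g(1.0), not df/dx = 6.0
    ```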
    - Last Modified: Tue Jun 11 23:24:08 UTC 2024
    - 793K bytes
  4. RELEASE.md

            *   `tf.compat.v1.nn.fused_batch_norm` backprop to `offset` when
                `is_training=False`
            *   `tf.image.adjust_contrast` forward
            *   `tf.nn.depthwise_conv2d` backprop to `filter` when not using cuDNN
                convolution
            *   `tf.image.resize` with `method=ResizeMethod.NEAREST` backprop
            *   `tf.math.bincount`
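
    For context (a sketch, not text from RELEASE.md): these exceptions surface once op determinism is enabled; tf.config.experimental.enable_op_determinism (TF 2.8+) is the opt-in.

    ```python
    import tensorflow as tf

    # After this call, the ops listed above raise tf.errors.UnimplementedError
    # instead of running non-deterministically (typically on GPU, where the
    # kernels lack deterministic implementations).
    tf.config.experimental.enable_op_determinism()

    values = tf.constant([1, 1, 2, 3, 3, 3])
    try:
        print(tf.math.bincount(values))  # fine where a deterministic kernel exists
    except tf.errors.UnimplementedError as e:
        print("no deterministic kernel:", e)
    ```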
    - Last Modified: Tue Jun 11 23:24:08 UTC 2024
    - 730.3K bytes
  5. tensorflow/compiler/mlir/tensorflow/transforms/tf_passes.td

      let description = [{
        Automatic space-to-depth transform is done by inserting a space-to-depth transform op after the
        host input and applying the space-to-depth transform to the first convolution and its filter backprop on the TPU.
    
        For example, original program:
    
        ```mlir
        module {
          func @while_body {
            %input = "tf.IteratorGetNext"(...) {device = "/CPU:0"}: -> tensor<2x224x224x3xf32>
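
    The rewrite this pass applies can be sketched in NumPy (illustrative only; the pass itself operates on MLIR): with block size 2, the 2x224x224x3 host input above becomes 2x112x112x12, trading spatial resolution for channels.

    ```python
    import numpy as np

    def space_to_depth(x, block=2):
        # NHWC -> (N, H/b, W/b, C*b*b): move b x b spatial blocks into channels
        n, h, w, c = x.shape
        x = x.reshape(n, h // block, block, w // block, block, c)
        x = x.transpose(0, 1, 3, 2, 4, 5)
        return x.reshape(n, h // block, w // block, c * block * block)

    host_input = np.zeros((2, 224, 224, 3), dtype=np.float32)
    print(space_to_depth(host_input).shape)  # (2, 112, 112, 12)
    ```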
    - Last Modified: Wed Jun 12 21:18:05 UTC 2024
    - 99.6K bytes