Results 1 - 2 of 2 for Motivation (0.17 sec)
RELEASE.md
* Add `UnifiedGRU` as the new GRU implementation for tf2.0. Change the default recurrent activation function for GRU from `hard_sigmoid` to `sigmoid`, and `reset_after` to True in 2.0. Historically the recurrent activation has been `hard_sigmoid` since it is faster than `sigmoid`. With the new unified backend between CPU and GPU mode, and since the CuDNN kernel uses `sigmoid`, we change the default for CPU mode to `sigmoid` as well.
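In practice, the change means a GRU layer built with TF 2.x defaults already satisfies the CuDNN kernel's constraints. A minimal sketch using the public `tf.keras.layers.GRU` API (the layer and its `recurrent_activation`/`reset_after` arguments are real; the unit count is illustrative):

```python
import tensorflow as tf

# TF 2.x defaults: recurrent_activation='sigmoid', reset_after=True.
# These match the CuDNN kernel, so on GPU this layer can run the
# fast fused implementation without any extra configuration.
gru_tf2 = tf.keras.layers.GRU(64)

# Reproducing the historical TF 1.x behavior instead: hard_sigmoid
# and reset_after=False. This combination falls back to the generic
# (non-CuDNN) implementation.
gru_tf1_style = tf.keras.layers.GRU(
    64,
    recurrent_activation='hard_sigmoid',
    reset_after=False,
)
```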
tensorflow/compiler/mlir/lite/ir/tfl_ops.td
```tablegen
    QuantizableResult,
    PredOpTrait<"input and output must have same element type",
      TFL_TCresVTEtIsSameAsOp<0, 0>>]> {
  let summary = "Hardswish activation function.";

  let description = [{
    Computes hard-swish activation function
      f(x) -> (x * relu6(x+3))/6
    element-wise.
  }];

  let arguments = (ins TFL_TensorOf<[F32, QUI8, QI8]>:$input);
```
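The op description gives the hard-swish formula directly: f(x) = (x * relu6(x+3))/6, where relu6 clamps its argument to [0, 6]. A quick NumPy sketch of that element-wise computation (the `hard_swish` function name here is just for illustration):

```python
import numpy as np

def hard_swish(x):
    # relu6(y) = min(max(y, 0), 6); hard-swish is x * relu6(x + 3) / 6
    return x * np.clip(x + 3.0, 0.0, 6.0) / 6.0

x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(hard_swish(x))  # approximately [-0. -0.3333 0. 0.6667 4.]
```

Note how the function behaves like the identity for large positive inputs (the clamp saturates at 6, so x * 6 / 6 = x) and is exactly zero for inputs at or below -3.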