Results 61 - 70 of 118 for necessarily (0.2 sec)
src/runtime/malloc.go
// case, the memory reserved in (s *pageAlloc).init for chunks
// is causing important slowdowns.
//
// On other platforms, the user address space is contiguous
// and starts at 0, so no offset is necessary.
arenaBaseOffset = 0xffff800000000000*goarch.IsAmd64 + 0x0a00000000000000*goos.IsAix

// A typed version of this constant that will make it into DWARF (for viewcore).
arenaBaseOffsetUintptr = uintptr(arenaBaseOffset)
Registered: Wed Jun 12 16:32:35 UTC 2024 - Last Modified: Wed May 29 17:58:53 UTC 2024 - 59.6K bytes - Viewed (0) -
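The malloc.go snippet above picks the arena base offset with branch-free constant arithmetic: goarch.IsAmd64 and goos.IsAix are 0-or-1 integer constants, so multiplying each candidate offset by them selects exactly one term at compile time. A minimal sketch of the same trick, assuming an amd64 (non-AIX) target; the isAmd64/isAix constants here are illustrative stand-ins for the runtime's internal packages:

```go
package main

import "fmt"

// Illustrative stand-ins for goarch.IsAmd64 and goos.IsAix, which the
// runtime defines as 0-or-1 integer constants per build target; here we
// assume an amd64, non-AIX target.
const (
	isAmd64 = 1
	isAix   = 0
)

// Multiplying each candidate offset by a 0-or-1 constant selects exactly
// one term at compile time, with no build tags or branches, which is the
// same trick malloc.go uses for arenaBaseOffset.
const arenaBaseOffset = 0xffff800000000000*isAmd64 + 0x0a00000000000000*isAix

func main() {
	fmt.Printf("%#x\n", uint64(arenaBaseOffset))
}
```

Because the selection happens in a constant expression, the unused term costs nothing at runtime and the constant can still be given a typed twin (as arenaBaseOffsetUintptr is) for debug info.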
tensorflow/c/c_api.h
// function. This deallocation function will point to client code
// for tensors populated by the client. So the client can do things
// like shadowing a numpy array.
// * We do not provide TF_OK since it is not strictly necessary and we
//   are not optimizing for convenience.
// * We make assumption that one session has one graph. This should be
//   fine since we have the ability to run sub-graphs.
Registered: Sun Jun 16 05:45:23 UTC 2024 - Last Modified: Thu Oct 26 21:08:15 UTC 2023 - 82.3K bytes - Viewed (0) -
tensorflow/compiler/mlir/tensorflow/transforms/sparsecore/embedding_pipelining.cc
// Embedding API.
//
// Since we're inserting a replication boundary around the backward pass
// function, we'll also need to make sure TPUReplicatedInputOp and
// TPUReplicatedOutputOp ops are inserted as necessary.

// First, walk the Ops dependencies.
GatherOpsForExtraction(&backward_pass_ops, merged_set,
                       /*predecessors=*/false, /*successors=*/true);
Registered: Sun Jun 16 05:45:23 UTC 2024 - Last Modified: Thu Apr 25 16:01:03 UTC 2024 - 92.9K bytes - Viewed (0) -
src/runtime/asm_amd64.s
// clobbered by wbBufFlush and were not saved by the caller.
// It is possible for wbBufFlush to clobber other registers
// (e.g., SSE registers), but the compiler takes care of saving
// those in the caller if necessary. This strikes a balance
// with registers that are likely to be used.
//
// We don't have type information for these, but all code under
// here is NOSPLIT, so nothing will observe these.
//
Registered: Wed Jun 12 16:32:35 UTC 2024 - Last Modified: Sat May 11 20:38:24 UTC 2024 - 60.4K bytes - Viewed (0) -
cmd/xl-storage.go
		return s, errFaultyDisk
	}
	return s, err
}
s.formatData = formatData
s.formatFileInfo = formatFi
s.formatFile = pathJoin(s.drivePath, minioMetaBucket, formatConfigFile)
// Create all necessary bucket folders if possible.
if err = makeFormatErasureMetaVolumes(s); err != nil {
	return s, err
}
if len(s.formatData) > 0 {
	format := &formatErasureV3{}
Registered: Sun Jun 16 00:44:34 UTC 2024 - Last Modified: Mon Jun 10 15:51:27 UTC 2024 - 85.3K bytes - Viewed (0) -
tensorflow/compiler/jit/encapsulate_subgraphs_pass.cc
// would be possible to add a switch statement over data_type to create a value
// for the constant, but that would entail maintaining the logic as new types
// are added, and is not necessary.) If the node being replaced was within a
// control flow frame, adds appropriate Enter nodes so that the use of the Const
// is well-formed.
Node* AddDummyShapedNode(const Node* src_node, int src_port,
Registered: Sun Jun 16 05:45:23 UTC 2024 - Last Modified: Thu Feb 22 08:47:20 UTC 2024 - 51K bytes - Viewed (0) -
src/cmd/internal/obj/riscv/obj.go
// Mark the stack bound check and morestack call async nonpreemptible.
// If we get preempted here, when resumed the preemption request is
// cleared, but we'll still call morestack, which will double the stack
// unnecessarily. See issue #35470.
p = ctxt.StartUnsafePoint(p, newprog)

var to_done, to_more *obj.Prog

if framesize <= abi.StackSmall {
	// small stack
	//
	// if SP > stackguard { goto done }
Registered: Wed Jun 12 16:32:35 UTC 2024 - Last Modified: Sun Apr 07 03:32:27 UTC 2024 - 77K bytes - Viewed (0) -
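The riscv prologue snippet shows the small-frame case of Go's stack bound check: when framesize is at most StackSmall, a single "SP > stackguard" comparison decides whether to fall through or call morestack. A minimal sketch of that decision, assuming the same comparison; needMoreStack is a hypothetical helper, not a function in obj.go:

```go
package main

import "fmt"

// needMoreStack mirrors the small-frame prologue check sketched above:
// for framesize <= StackSmall the check is a single comparison, and
// morestack runs only when "SP > stackguard" is false.
func needMoreStack(sp, stackguard uint64) bool {
	return sp <= stackguard
}

func main() {
	fmt.Println(needMoreStack(0x1000, 0x2000)) // SP at or below the guard: grow the stack
	fmt.Println(needMoreStack(0x3000, 0x2000)) // room to spare: fall through to done
}
```

The "async nonpreemptible" marking in the snippet exists precisely because this check and the morestack call must not be split by a preemption, or the stack would be doubled for no reason.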
src/runtime/mheap.go
// it during this function. Currently to ensure that we enforce
// that the function is run on the system stack, because that's
// the only place it is used now. In the future, this requirement
// may be relaxed if its use is necessary elsewhere.
//
//go:systemstack
func (h *mheap) tryAllocMSpan() *mspan {
	pp := getg().m.p.ptr()
	// If we don't have a p or the cache is empty, we can't do
	// anything here.
Registered: Wed Jun 12 16:32:35 UTC 2024 - Last Modified: Wed May 22 22:31:00 UTC 2024 - 78K bytes - Viewed (0) -
guava/src/com/google/common/collect/ImmutableSortedMap.java
@Override
public Builder<K, V> put(K key, V value) {
  super.put(key, value);
  return this;
}

/**
 * Adds the given {@code entry} to the map, making it immutable if necessary. Duplicate keys,
 * according to the comparator (which might be the keys' natural order), are not allowed, and
 * will cause {@link #build} to fail.
 *
 * @since 11.0
 */
Registered: Wed Jun 12 16:38:11 UTC 2024 - Last Modified: Thu Feb 22 21:19:52 UTC 2024 - 50.3K bytes - Viewed (0) -
platforms/documentation/docs/src/docs/userguide/releases/upgrading/upgrading_version_8.adoc
Now that these types implement link:{javadocPath}/org/gradle/api/Named.html[`Named`], these classes are no longer necessary and have been deprecated. They will be removed in Gradle 9.0. Use link:{javadocPath}/org/gradle/api/Named.Namer.html#INSTANCE[`Named.Namer.INSTANCE`] instead.
Registered: Wed Jun 12 18:38:38 UTC 2024 - Last Modified: Fri Jun 07 17:01:07 UTC 2024 - 90.7K bytes - Viewed (0)