Uses of Package
org.anchoranalysis.inference.concurrency
Packages that use org.anchoranalysis.inference.concurrency
Package: org.anchoranalysis.inference.concurrency
Description: Specifying how many CPUs and GPUs can be allocated for some purpose.
Classes in org.anchoranalysis.inference.concurrency used by org.anchoranalysis.inference.concurrency

Class descriptions:
- How many allocated CPUs and GPUs can be used concurrently for inference.
- An instance of a model that can be used concurrently for inference.
- This exception indicates that an error occurred when performing inference from a model concurrently.
- When creating a model to be used for inference fails.
- Creates a model to use in the pool.
- Wraps an element of type T to ensure priority is given when the flag gpu==true.
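The last description above refers to a wrapper that prioritizes GPU-flagged elements. As a minimal sketch of that idea (the class and method names below are illustrative assumptions, not the library's actual API), a generic wrapper can carry a gpu flag and order GPU-backed elements ahead of CPU-only ones:

```java
// Illustrative sketch only: names are assumptions, not the library's API.
// Wraps an element of type T so that elements with gpu==true sort first,
// e.g. when draining a priority queue of candidate models.
class PrioritizedElement<T> implements Comparable<PrioritizedElement<T>> {

    private final T element;
    private final boolean gpu; // when true, this element takes priority

    PrioritizedElement(T element, boolean gpu) {
        this.element = element;
        this.gpu = gpu;
    }

    /** The wrapped element. */
    public T get() {
        return element;
    }

    /** Whether this element is backed by a GPU. */
    public boolean isGpu() {
        return gpu;
    }

    /** Orders GPU-flagged elements before CPU-only elements. */
    @Override
    public int compareTo(PrioritizedElement<T> other) {
        return Boolean.compare(other.gpu, this.gpu);
    }
}
```

Placed in a java.util.PriorityQueue, such wrappers would yield GPU-backed elements before CPU-only ones, which is one plausible way to give GPU inference priority in a shared model pool.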