All Classes and Interfaces
Class
Description
How many allocated CPUs and GPUs can be used concurrently for inference.
An instance of a model that can be used concurrently for inference.
This exception indicates that an error occurred when performing inference from a model concurrently.
Keeps concurrent copies of a model to be used by different threads.
Thrown when creating a model to be used for inference fails.
Creates a model to use in the pool.
A model used for inference.
Wraps an element of type T to ensure priority is given when the flag gpu == true.
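The index above describes a pool that keeps concurrent copies of a model so that different threads never share a single inference instance. The class names in this index are not shown, so the following is only a minimal sketch of that pattern under assumed names (ModelPool, withModel are hypothetical, not the documented API), using a BlockingQueue to hand out idle copies:

```java
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Function;

// Hypothetical sketch: keeps N independent copies of a model so that each
// thread borrows its own (possibly non-thread-safe) instance for inference.
final class ModelPool<M> {
    private final BlockingQueue<M> idle;

    ModelPool(List<M> copies) {
        this.idle = new ArrayBlockingQueue<>(copies.size(), false, copies);
    }

    // Borrow a copy, run inference, and always return the copy to the pool.
    <R> R withModel(Function<M, R> inference) throws InterruptedException {
        M model = idle.take(); // blocks until some copy is free
        try {
            return inference.apply(model);
        } finally {
            idle.put(model);   // hand the copy back for other threads
        }
    }
}
```

The try/finally guarantees a borrowed copy is returned even if inference throws, so the pool never leaks instances under concurrent use.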
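The last entry describes a wrapper that gives priority to elements whose gpu flag is true. Since the actual class name is not shown in this index, here is a hedged sketch of one way such a wrapper could work (PrioritizedTask is a hypothetical name): a Comparable wrapper whose ordering puts gpu == true first, usable with a PriorityBlockingQueue.

```java
import java.util.concurrent.PriorityBlockingQueue;

// Hypothetical sketch: wraps an element of type T so that elements with
// gpu == true sort ahead of the rest in a priority queue.
final class PrioritizedTask<T> implements Comparable<PrioritizedTask<T>> {
    final T element;
    final boolean gpu;

    PrioritizedTask(T element, boolean gpu) {
        this.element = element;
        this.gpu = gpu;
    }

    @Override
    public int compareTo(PrioritizedTask<T> other) {
        // true compares "smaller", so gpu tasks reach the queue head first
        return Boolean.compare(other.gpu, this.gpu);
    }
}
```

Draining a PriorityBlockingQueue of these wrappers then yields all gpu-flagged elements before the others, which matches the stated intent of giving priority when gpu == true.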