AppleCoreMLInferenceMode

Inference modes for Apple CoreML on supported devices.

enum AppleCoreMLInferenceMode {
  CPU = 0,
  GPU = 1,
  ANE = 2,
}

Values

CPU (0): Uses only the CPU for inference computations. The most compatible but potentially slowest option.

GPU (1): Prioritizes the GPU for inference computations. Better performance for parallel processing tasks.

ANE (2): Automatic selection with the Apple Neural Engine (ANE) prioritized. Best performance on supported devices with Neural Engine hardware.
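
Usage

The sketch below shows one way such a mode might be wired into an application's session configuration. The InferenceConfig interface, selectInferenceMode helper, and their field names are hypothetical, introduced only for illustration; only the AppleCoreMLInferenceMode enum itself comes from this page, and it is repeated here so the example is self-contained.

// Repeated from the declaration above so this sketch stands alone.
enum AppleCoreMLInferenceMode {
  CPU = 0,
  GPU = 1,
  ANE = 2,
}

// Hypothetical configuration shape; not part of the documented API.
interface InferenceConfig {
  // Which Apple CoreML compute target to use for inference.
  coreMLInferenceMode: AppleCoreMLInferenceMode;
}

// Prefer the Neural Engine where the hardware supports it,
// otherwise fall back to the most compatible option (CPU).
function selectInferenceMode(hasNeuralEngine: boolean): AppleCoreMLInferenceMode {
  return hasNeuralEngine ? AppleCoreMLInferenceMode.ANE : AppleCoreMLInferenceMode.CPU;
}

const config: InferenceConfig = {
  coreMLInferenceMode: selectInferenceMode(true),
};

// Numeric enums support reverse mapping, so this prints "ANE".
console.log(`Selected CoreML inference mode: ${AppleCoreMLInferenceMode[config.coreMLInferenceMode]}`);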