Method SetInference_UseCoreML

Namespace: MaaFramework.Binding
Assembly: MaaFramework.Binding.dll

SetInference_UseCoreML(IMaaOption<ResourceOption>?, int)

Sets the inference execution provider and device to use CoreML.

public static bool SetInference_UseCoreML(this IMaaOption<ResourceOption>? opt, int coreMLFlags = -1)

Parameters

opt IMaaOption<ResourceOption>?

The option.

coreMLFlags int

The CoreML flags.

Returns

bool

true if the option was set successfully; otherwise, false.

Remarks

Refer to onnxruntime's COREMLFlags for the available flag values.

Note that the supported values depend on the onnxruntime version MaaFramework is built against; the newest flags may not be supported.

Exceptions

ArgumentNullException

Thrown if opt is null.
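
Examples

A minimal usage sketch, assuming resource is an IMaaOption<ResourceOption> implementation (for example, a MaaResource instance created elsewhere); the 0x001 value is assumed to correspond to COREML_FLAG_USE_CPU_ONLY in onnxruntime's headers.

using MaaFramework.Binding;

internal static class CoreMLExample
{
    // resource is assumed to be any IMaaOption<ResourceOption> implementation,
    // e.g. a MaaResource instance created elsewhere.
    public static void ConfigureCoreML(IMaaOption<ResourceOption> resource)
    {
        // Enable the CoreML execution provider with the default flags (-1).
        bool ok = resource.SetInference_UseCoreML();

        // Or pass raw COREMLFlags bits; 0x001 is assumed to match
        // COREML_FLAG_USE_CPU_ONLY in the onnxruntime headers.
        bool cpuOnly = resource.SetInference_UseCoreML(coreMLFlags: 0x001);
    }
}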

SetInference_UseCoreML(IMaaOption<ResourceOption>?, InferenceCoreMLFlags)

Sets the inference execution provider and device to use CoreML.

public static bool SetInference_UseCoreML(this IMaaOption<ResourceOption>? opt, InferenceCoreMLFlags coreMLFlags = (InferenceCoreMLFlags)-1)

Parameters

opt IMaaOption<ResourceOption>?

The option.

coreMLFlags InferenceCoreMLFlags

The CoreML flags.

Returns

bool

true if the option was set successfully; otherwise, false.

Remarks

Refer to onnxruntime's COREMLFlags for the available flag values.

Note that the supported values depend on the onnxruntime version MaaFramework is built against; the newest flags may not be supported.

Exceptions

ArgumentNullException

Thrown if opt is null.
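
Examples

A minimal sketch for the typed overload, assuming resource implements IMaaOption<ResourceOption> (for example, a MaaResource instance created elsewhere); the cast value 0x001 is assumed to mirror onnxruntime's COREML_FLAG_USE_CPU_ONLY, and the corresponding InferenceCoreMLFlags member name may differ by binding version.

using MaaFramework.Binding;

internal static class CoreMLTypedExample
{
    // resource is assumed to implement IMaaOption<ResourceOption>,
    // e.g. a MaaResource instance created elsewhere.
    public static void ConfigureCoreML(IMaaOption<ResourceOption> resource)
    {
        // The cast value 0x001 is assumed to mirror onnxruntime's
        // COREML_FLAG_USE_CPU_ONLY; prefer the named InferenceCoreMLFlags
        // member if your binding version defines one for this flag.
        bool ok = resource.SetInference_UseCoreML((InferenceCoreMLFlags)0x001);
    }
}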