public class FritzTFLiteInterpreter
extends Object

java.lang.Object
   ↳ ai.fritz.customtflite.FritzTFLiteInterpreter

Class Overview

A TensorFlow Lite interpreter to manage and track model inference with Fritz.

FritzTFLiteInterpreter wraps the TensorFlow Lite Interpreter class; its methods mirror those of TensorFlow Lite's Interpreter.
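A minimal end-to-end sketch of the lifecycle (create, run, close). The CustomModel subclass name (MyCustomModel) and the 1x224x224x3 float input / 1x1000 float output shapes are illustrative assumptions; adjust them to your model:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

FritzTFLiteInterpreter interpreter = null;
try {
    // Load the model bundled with the app (may throw IOException).
    interpreter = FritzTFLiteInterpreter.create(new MyCustomModel());

    // Input buffer for one 224x224 RGB image of floats (4 bytes each).
    ByteBuffer input = ByteBuffer.allocateDirect(4 * 224 * 224 * 3)
            .order(ByteOrder.nativeOrder());
    float[][] output = new float[1][1000];

    // Run inference; Fritz records metrics for this call.
    interpreter.run(input, output);
} catch (IOException e) {
    // The model could not be loaded from the path in the CustomModel.
} finally {
    if (interpreter != null) {
        interpreter.close(); // release native resources
    }
}
```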

Summary

Public Methods
void close()
Release resources associated with the Interpreter.
static FritzTFLiteInterpreter create(CustomModel settings)
Creates a FritzTFLiteInterpreter to use in the app.
int getInputIndex(String opName)
Gets index of an input given the op name of the input.
long getLastNativeInferenceDurationNanoseconds()
Gets the most recent native inference time, in nanoseconds.
int getOutputIndex(String opName)
Gets index of an output given the op name of the output.
void resizeInput(int idx, int[] dims)
Resizes idx-th input of the native model to the given dims.
void run(Object input, Object output)
Runs model inference on the given input, writing results to the given output.
void runForMultipleInputsOutputs(Object[] inputs, Map<Integer, Object> outputs)
Runs model inference for multiple inputs / outputs.
void setUseNNAPI(boolean useNNAPI)
Turns on/off Android NNAPI for hardware acceleration when it is available.
Inherited Methods
From class java.lang.Object

Public Methods

public void close ()

Release resources associated with the Interpreter.

public static FritzTFLiteInterpreter create (CustomModel settings)

Creates a FritzTFLiteInterpreter to use in the app.

Loads the model from the path specified in the CustomModel.

Returns
a FritzTFLiteInterpreter for the loaded model.

Parameters
settings a CustomModel describing the model bundled with the app.
Throws
IOException

public int getInputIndex (String opName)

Gets index of an input given the op name of the input.

Parameters
opName the operation name of the input

public long getLastNativeInferenceDurationNanoseconds ()

Gets the time the most recent native inference took, in nanoseconds.
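For example, the timing of the most recent run(...) call can be read back afterwards; the log tag and millisecond conversion are illustrative:

```java
interpreter.run(input, output);
long nanos = interpreter.getLastNativeInferenceDurationNanoseconds();
android.util.Log.d("FritzDemo", "Inference took " + (nanos / 1_000_000) + " ms");
```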

public int getOutputIndex (String opName)

Gets index of an output given the op name of the output.

Parameters
opName the operation name of the output

public void resizeInput (int idx, int[] dims)

Resizes idx-th input of the native model to the given dims.

Parameters
idx index of the input to resize
dims the new dimensions of the input
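A sketch combining getInputIndex and resizeInput. The input op name ("image") and the batched shape are assumptions for illustration:

```java
// Look up the input's index by its op name ("image" is an assumption).
int idx = interpreter.getInputIndex("image");

// Resize that input to a batch of 4 RGB images (shape is illustrative).
interpreter.resizeInput(idx, new int[] {4, 224, 224, 3});
```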

public void run (Object input, Object output)

Runs model inference on the given input, writing results to the given output.

The interpreter will record metrics on model execution.

public void runForMultipleInputsOutputs (Object[] inputs, Map<Integer, Object> outputs)

Runs model inference for multiple inputs / outputs.

The interpreter will record metrics on model execution.
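A sketch for a model with two inputs and two outputs; the input buffers, output shapes, and tensor indices are illustrative assumptions:

```java
import java.util.HashMap;
import java.util.Map;

// Input buffers prepared elsewhere, in the order the model expects.
Object[] inputs = { imageBuffer, maskBuffer };

// Map each output tensor index to a pre-allocated container.
Map<Integer, Object> outputs = new HashMap<>();
outputs.put(0, new float[1][10]);
outputs.put(1, new float[1][4]);

interpreter.runForMultipleInputsOutputs(inputs, outputs);
```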

public void setUseNNAPI (boolean useNNAPI)

Turns on/off Android NNAPI for hardware acceleration when it is available.

Parameters
useNNAPI true to enable NNAPI, false to disable it