public class FritzTFMobileInterpreter extends Object

java.lang.Object
   ↳ ai.fritz.customtfmobile.FritzTFMobileInterpreter

Class Overview

A TensorFlow Mobile interpreter to manage and track model inference with Fritz.

FritzTFMobileInterpreter wraps TensorFlowInferenceInterface.
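
For orientation, here is a minimal sketch of the feed/run/fetch cycle, assuming a CustomModel instance is already available from your app's Fritz setup; the tensor names ("input", "output"), dimensions, and the CustomModel import path are placeholders and assumptions, not values from this reference.

    import java.io.IOException;

    import ai.fritz.core.CustomModel;                        // import path assumed
    import ai.fritz.customtfmobile.FritzTFMobileInterpreter;

    public class ExampleInference {

        // "input", "output" and the dimensions are hypothetical placeholders;
        // substitute the tensor names and shape of your own model.
        public float[] classify(CustomModel customModel, float[] pixels) throws IOException {
            FritzTFMobileInterpreter interpreter = FritzTFMobileInterpreter.create(customModel);

            // Feed a 1x224x224x3 float tensor under the graph's input name.
            interpreter.feed("input", pixels, 1, 224, 224, 3);

            // Run inference; the interpreter records metrics on model execution.
            interpreter.run(new String[]{"output"});

            // Copy the output tensor into a preallocated array.
            float[] scores = new float[1000];
            interpreter.fetch("output", scores);

            interpreter.close();
            return scores;
        }
    }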

Summary

Nested Classes
interface FritzTFMobileInterpreter.OTAUpdateListener Callback invoked before the interpreter is reloaded from an OTA update. 
Public Methods
void close()
static FritzTFMobileInterpreter create(CustomModel createdCustomModel, FritzTFMobileInterpreter.OTAUpdateListener listener)
Creates a FritzTFMobileInterpreter to use in the app.
static FritzTFMobileInterpreter create(CustomModel settings)
Creates a FritzTFMobileInterpreter to use in the app.
void feed(String inputName, float[] src, long... dims)
void feed(String inputName, int[] src, long... dims)
void feed(String inputName, FloatBuffer src, long... dims)
void feed(String inputName, DoubleBuffer src, long... dims)
void feed(String inputName, ByteBuffer src, long... dims)
void feed(String inputName, LongBuffer src, long... dims)
void feed(String inputName, IntBuffer src, long... dims)
void feed(String inputName, long[] src, long... dims)
void feed(String inputName, byte[] src, long... dims)
void feed(String inputName, boolean[] src, long... dims)
void feed(String inputName, double[] src, long... dims)
void feedString(String inputName, byte[] src)
void feedString(String inputName, byte[][] src)
void fetch(String outputName, float[] dst)
void fetch(String outputName, LongBuffer dst)
void fetch(String outputName, ByteBuffer dst)
void fetch(String outputName, IntBuffer dst)
void fetch(String outputName, byte[] dst)
void fetch(String outputName, double[] dst)
void fetch(String outputName, FloatBuffer dst)
void fetch(String outputName, long[] dst)
void fetch(String outputName, int[] dst)
void fetch(String outputName, DoubleBuffer dst)
TensorFlowInferenceInterface getInferenceInterface()
Get the underlying inference interface if you need it.
String getStatString()
Graph graph()
Operation graphOperation(String operationName)
void run(String[] outputNames)
Run model inference.
void run(String[] outputNames, boolean enableStats, String[] targetNodeNames)
Run model inference.
void run(String[] outputNames, boolean logStats)
Run model inference.
Inherited Methods
From class java.lang.Object

Public Methods

public void close ()

public static FritzTFMobileInterpreter create (CustomModel createdCustomModel, FritzTFMobileInterpreter.OTAUpdateListener listener)

Creates a FritzTFMobileInterpreter to use in the app.

Parameters
createdCustomModel the custom model used to load the model from app storage.
listener the OTA listener
Returns
  • FritzTFMobileInterpreter the interpreter
Throws
IOException if the model can't be loaded properly.
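
A hedged sketch of passing an OTA listener: it assumes OTAUpdateListener is a single-method (functional) interface, so a lambda can be used, and that its callback runs before the interpreter reloads an over-the-air model update. Check the OTAUpdateListener reference for the exact callback signature.

    import java.io.IOException;

    import android.util.Log;

    import ai.fritz.core.CustomModel;                        // import path assumed
    import ai.fritz.customtfmobile.FritzTFMobileInterpreter;

    public class OtaAwareLoader {

        // Assumption: OTAUpdateListener is a functional interface whose single
        // method is invoked before the interpreter reloads from an OTA update.
        public FritzTFMobileInterpreter load(CustomModel customModel) throws IOException {
            return FritzTFMobileInterpreter.create(
                    customModel,
                    () -> Log.d("FritzDemo", "OTA model update about to be applied"));
        }
    }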

public static FritzTFMobileInterpreter create (CustomModel settings)

Creates a FritzTFMobileInterpreter to use in the app.

Uses an empty OTAUpdateListener by default.

Parameters
settings the custom model used to load the model from app storage.
Returns
  • FritzTFMobileInterpreter the interpreter
Throws
IOException if the model can't be loaded properly.
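
A minimal sketch of the single-argument overload (imports as in the earlier sketches), assuming customModel comes from your app's Fritz configuration:

    // The single-argument overload registers an empty OTAUpdateListener for you.
    public FritzTFMobileInterpreter loadDefault(CustomModel customModel) throws IOException {
        return FritzTFMobileInterpreter.create(customModel);
    }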

public void feed (String inputName, float[] src, long... dims)

public void feed (String inputName, int[] src, long... dims)

public void feed (String inputName, FloatBuffer src, long... dims)

public void feed (String inputName, DoubleBuffer src, long... dims)

public void feed (String inputName, ByteBuffer src, long... dims)

public void feed (String inputName, LongBuffer src, long... dims)

public void feed (String inputName, IntBuffer src, long... dims)

public void feed (String inputName, long[] src, long... dims)

public void feed (String inputName, byte[] src, long... dims)

public void feed (String inputName, boolean[] src, long... dims)

public void feed (String inputName, double[] src, long... dims)

public void feedString (String inputName, byte[] src)

public void feedString (String inputName, byte[][] src)

public void fetch (String outputName, float[] dst)

public void fetch (String outputName, LongBuffer dst)

public void fetch (String outputName, ByteBuffer dst)

public void fetch (String outputName, IntBuffer dst)

public void fetch (String outputName, byte[] dst)

public void fetch (String outputName, double[] dst)

public void fetch (String outputName, FloatBuffer dst)

public void fetch (String outputName, long[] dst)

public void fetch (String outputName, int[] dst)

public void fetch (String outputName, DoubleBuffer dst)
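
The buffer overloads follow the same pattern as the array overloads; a sketch assuming the image bytes fit a direct ByteBuffer and that the tensor names and shape below match your graph (they are placeholders here):

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.nio.FloatBuffer;

    import ai.fritz.customtfmobile.FritzTFMobileInterpreter;

    public class BufferExample {

        // Buffer-based feed/fetch; "image_input", "scores" and the shape are placeholders.
        void runWithBuffers(FritzTFMobileInterpreter interpreter, byte[] imageBytes) {
            ByteBuffer input = ByteBuffer.allocateDirect(imageBytes.length)
                    .order(ByteOrder.nativeOrder());
            input.put(imageBytes);
            input.rewind();

            interpreter.feed("image_input", input, 1, 224, 224, 3);
            interpreter.run(new String[]{"scores"});

            // fetch(...) copies the output tensor into the destination buffer,
            // which must be large enough to hold it.
            FloatBuffer scores = FloatBuffer.allocate(1000);
            interpreter.fetch("scores", scores);
        }
    }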

public TensorFlowInferenceInterface getInferenceInterface ()

Get the underlying inference interface if you need it.

Returns
  • the wrapped inferenceInterface.
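
If the wrapper does not expose something you need, you can drop down to the wrapped TensorFlow Mobile interface; a small sketch, where "output" is a placeholder operation name:

    import org.tensorflow.Operation;
    import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

    // Grab the wrapped TF Mobile interface for direct use.
    TensorFlowInferenceInterface tf = interpreter.getInferenceInterface();
    Operation outputOp = tf.graphOperation("output");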

public String getStatString ()

public Graph graph ()

public Operation graphOperation (String operationName)

public void run (String[] outputNames)

Run model inference.

The interpreter will record metrics on model execution.

public void run (String[] outputNames, boolean enableStats, String[] targetNodeNames)

Run model inference.

The interpreter will record metrics on model execution.
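
A sketch of the stats-enabled overload; "output" is a placeholder, and passing an empty targetNodeNames array when no extra target nodes are needed is an assumption:

    // Run with per-op stats enabled, then read back the accumulated stat string.
    interpreter.run(new String[]{"output"}, true, new String[0]);
    String stats = interpreter.getStatString();
    android.util.Log.d("FritzDemo", stats);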

public void run (String[] outputNames, boolean logStats)

Run model inference.

The interpreter will record metrics on model execution.