Custom TensorFlow Lite Models

Currently we support TensorFlow Lite for devices running Android 5.0 (Lollipop, SDK version 21) and higher.

Note

If you haven't set up the SDK yet, make sure to go through the setup directions first. You'll need to add the Core library to your app before using a specific feature API or a custom model. Follow the iOS setup or Android setup directions.
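As a reminder of what that setup looks like, here is a minimal sketch of initializing the SDK in an Application subclass. It assumes the Fritz.configure(context, apiKey) entry point described in the Core library's setup directions; the class name MyApplication and the API key placeholder are illustrative:

import android.app.Application;

import ai.fritz.core.Fritz;

public class MyApplication extends Application {

    @Override
    public void onCreate() {
        super.onCreate();
        // Initialize the Fritz Core SDK once, before using any feature APIs or custom models.
        // "<your api key>" is a placeholder; use the API key shown in the webapp.
        Fritz.configure(this, "<your api key>");
    }
}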

Register the FritzCustomModelService in your Android Manifest
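This service handles model downloads and updates in the background. A minimal sketch of the manifest entry, assuming the service class lives in the ai.fritz.core package and is registered like a standard bound JobService:

<!-- Inside the <application> element of AndroidManifest.xml. -->
<!-- The package path and attributes below are assumptions; confirm them against the SDK setup directions. -->
<service
    android:name="ai.fritz.core.FritzCustomModelService"
    android:exported="true"
    android:permission="android.permission.BIND_JOB_SERVICE" />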

Creating an interpreter for your model

You can either include your model with the app or download it at runtime (recommended, since it keeps the model out of your APK and reduces its size).

To find your model ID, go to Your Project > Project Settings > Your App > Show API Key in the webapp.

Include your model on-device:

  • Add your TensorFlow Lite model (.tflite) to your project's assets folder.
  • Create a new FritzOnDeviceModel instance pointing to your included model:

// Reference the model file in assets, plus its model ID and version from the webapp.
FritzOnDeviceModel onDeviceModel = new FritzOnDeviceModel("file:///android_asset/mnist.tflite", "<your model id>", 1);
FritzTFLiteInterpreter interpreter = new FritzTFLiteInterpreter(onDeviceModel);

Alternatively, wrap the model details in a separate class (you can download this class from the webapp under Custom Model > Your Model > SDK Instructions):

// Keeps the model's path, ID, and version together in one place.
public class MnistCustomModel extends FritzOnDeviceModel {

    private static final String MODEL_PATH = "file:///android_asset/mnist.tflite";
    private static final String MODEL_ID = "<your model id>";
    private static final int MODEL_VERSION = 1;

    public MnistCustomModel() {
        super(MODEL_PATH, MODEL_ID, MODEL_VERSION);
    }
}

FritzTFLiteInterpreter interpreter = new FritzTFLiteInterpreter(new MnistCustomModel());

Download your model at runtime:

Only available on Growth plans

For more information on plans and pricing, visit our website.

Create a FritzManagedModel with your model ID, wrap it in a FritzModelManager, and call loadModel; the listener receives a FritzOnDeviceModel once the model has been downloaded.

FritzTFLiteInterpreter tflite;

FritzManagedModel managedModel = new FritzManagedModel("<your model id>");
FritzModelManager modelManager = new FritzModelManager(managedModel);
modelManager.loadModel(new ModelReadyListener() {
    @Override
    public void onModelReady(FritzOnDeviceModel onDeviceModel) {
        // Called once the model has been downloaded and is available on-device.
        tflite = new FritzTFLiteInterpreter(onDeviceModel);
        Log.d(TAG, "Interpreter is now ready to use");
    }
});
Note

To download the model over Wi-Fi only, pass true as the second (useWifi) argument to loadModel; by default, the model can be downloaded over any network connection.

modelManager.loadModel(new ModelReadyListener() {
    @Override
    public void onModelReady(FritzOnDeviceModel onDeviceModel) {
        tflite = new FritzTFLiteInterpreter(onDeviceModel);
        Log.d(TAG, "Interpreter is now ready to use");
    }
}, true); // useWifi = true: only download the model over Wi-Fi
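Once the interpreter is ready (whichever way you created it), you can run inference with it. The following is a minimal sketch, assuming FritzTFLiteInterpreter mirrors the TensorFlow Lite Interpreter's run(input, output) signature; the buffer shapes are illustrative for the MNIST model used above:

// Assumed shapes for an MNIST classifier: one 28x28 grayscale image in, 10 class scores out.
float[][][][] input = new float[1][28][28][1];
float[][] output = new float[1][10];

// Fill `input` with pixel values, then run inference; results land in `output`.
tflite.run(input, output);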