TensorFlow Lite

If you’ve already trained and converted your own model for mobile, you can use the custom model library to manage your models on the edge. Currently we support TensorFlow Lite and TensorFlow Mobile for running models on devices with Android 5.0 (Lollipop, API level 21) and higher.

Note

If you haven’t set up the SDK yet, make sure to go through the setup instructions first. You’ll need to add the Core library to the app before using the feature-specific or custom model libraries.
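
For reference, SDK setup typically ends with initializing Fritz in your Application subclass. The snippet below is a minimal sketch; the two-argument configure call and the ai.fritz.core.Fritz import are assumptions based on a typical Core library setup, so check the setup instructions for the exact signature in your SDK version.

import android.app.Application;
import ai.fritz.core.Fritz;

public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        // Initialize the Fritz SDK with the API key from the webapp.
        // (Assumed signature; see the setup instructions for your version.)
        Fritz.configure(this, "<your api key>");
    }
}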


1. Add Fritz’s TFLite Library

Edit your app/build.gradle file:

Add our Maven repository so Gradle can resolve the Fritz libraries:

repositories {
    maven { url "https://raw.github.com/fritzlabs/fritz-repository/master" }
}

Add Fritz to your gradle dependencies and resync the project:

dependencies {
    implementation 'ai.fritz:core:2.0.0'
    implementation 'ai.fritz:custom-model-tflite:2.0.0'
}

2. Register the FritzCustomModelService in your Android Manifest

In order to monitor and update your models, we’ll need to register the JobService to listen for changes and handle over-the-air (OTA) updates.

To do this, open up your AndroidManifest.xml and add these lines:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="ai.fritz.fritzsdkapp">

    <!-- Fritz SDK internet permissions -->
    <uses-permission android:name="android.permission.INTERNET" />

    <application>
        ...
        <!-- Register FritzCustomModelService to handle OTA model updates -->
        <service
            android:name="ai.fritz.core.FritzCustomModelService"
            android:exported="true"
            android:permission="android.permission.BIND_JOB_SERVICE" />
    </application>
</manifest>

3. Add your custom model to your Android app’s assets folder

Make sure you’ve added your TensorFlow Lite model (.tflite) to your project’s assets folder, e.g. app/src/main/assets/mnist.tflite.
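
TensorFlow Lite models are often memory-mapped directly out of the APK, which requires the asset to be stored uncompressed. Whether the Fritz SDK needs this is an assumption here, but a common precaution when bundling .tflite assets is to exclude them from compression in app/build.gradle:

android {
    aaptOptions {
        noCompress "tflite"
    }
}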

4. Download or copy your Custom Model class to your app

FritzInterpreter allows you to manage your models remotely and, under the hood, provides the same methods as TFLite’s Interpreter.

Suppose you’ve stored a mnist.tflite model in the assets folder. You’ll want to create a custom model class like this (you can download this class from the webapp under Custom Model > Your Model > SDK Instructions):

import android.content.Context;

// CustomModel is provided by the Fritz core library.
public class MnistCustomModel extends CustomModel {
    // Path to the model bundled in the app's assets folder
    private static final String MODEL_PATH = "file:///android_asset/mnist.tflite";
    // Model ID from the webapp (Custom Model > Your Model > SDK Instructions)
    private static final String MODEL_ID = "<your model id>";
    // Increment when you upload a new version of the model
    private static final int MODEL_VERSION = 1;

    public MnistCustomModel(Context context) {
        super(MODEL_PATH, MODEL_ID, MODEL_VERSION);
    }
}

5. Create a TFLite interpreter and run predictions

From here, you can create a TensorFlow Lite interpreter and run predictions with the managed model:

// Create an interpreter for the managed model
FritzTFLiteInterpreter tflite = FritzTFLiteInterpreter.create(new MnistCustomModel(context));

// Run prediction with input / output buffers
tflite.run(inputBuffer, outputBuffer);
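
For concreteness, here is a minimal sketch of preparing the buffers for an MNIST-style model. The 28x28 grayscale float input and the 10-class output are assumptions about the example model, not something the SDK dictates, so match the sizes to your model’s actual tensors.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: run an MNIST-style model through the interpreter above.
float[] classifyDigit(FritzTFLiteInterpreter tflite, float[] pixels) {
    // 28 * 28 pixels, 4 bytes per float, in native byte order
    ByteBuffer inputBuffer = ByteBuffer.allocateDirect(28 * 28 * 4)
            .order(ByteOrder.nativeOrder());
    for (float pixel : pixels) {
        inputBuffer.putFloat(pixel);
    }

    // One score per digit class (0-9)
    float[][] outputBuffer = new float[1][10];
    tflite.run(inputBuffer, outputBuffer);
    return outputBuffer[0];
}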

6. Track the performance of your model

Run your app with the prediction code above to track model performance. Afterwards, you can see the measurements appear in the webapp under Dashboard:

[Chart: Average time / prediction (ms)]