TensorFlow Mobile

If you’ve already trained and converted your own model for mobile, you can use the custom model library to manage your models on the edge. Currently we support TensorFlow Lite and TensorFlow Mobile for running models on Android devices running 5.0 (Lollipop, SDK version 21) and higher.

Note

If you haven’t set up the SDK yet, make sure to go through those directions first. You’ll need to add the Core library to the app before using the specific feature or custom model libraries.
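For reference, here is a minimal sketch of what the Core setup looks like in an Application subclass. The MyApplication class name is hypothetical, and this assumes the Fritz.configure call covered in the SDK setup directions (see those directions for configuring your API key):

import android.app.Application;

import ai.fritz.core.Fritz;

public class MyApplication extends Application {

    @Override
    public void onCreate() {
        super.onCreate();
        // Initialize the Fritz Core library before using any feature or custom model libraries.
        Fritz.configure(this);
    }
}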


1. Add Fritz’s TFMobile Library

Edit your app/build.gradle file:

Allow libraries from our repository:

repositories {
    maven { url "https://raw.github.com/fritzlabs/fritz-repository/master" }
}

Add Fritz to your gradle dependencies and resync the project:

dependencies {
    implementation 'ai.fritz:core:2.0.0'
    implementation 'ai.fritz:custom-model-tfmobile:2.0.0'
}

2. Register the FritzCustomModelService in your Android Manifest

In order to monitor and update your models, we’ll need to register the JobService to listen for changes and handle over the air (OTA) updates.

To do this, open up your AndroidManifest.xml and add these lines:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="ai.fritz.fritzsdkapp">

    <!-- Fritz SDK internet permissions -->
    <uses-permission android:name="android.permission.INTERNET" />

    <application>
        ...

        <!-- Register FritzCustomModelService for over the air (OTA) model updates -->
        <service
            android:name="ai.fritz.core.FritzCustomModelService"
            android:exported="true"
            android:permission="android.permission.BIND_JOB_SERVICE" />

    </application>
</manifest>

3. Download or copy your Custom Model class to your app.

Let’s assume you’ve stored a mnist.pb frozen graph in the assets folder (TensorFlow Mobile runs frozen GraphDef .pb files rather than .tflite files). You’ll want to create a custom model class like this (you can download this class from the webapp under Custom Model > Your Model > SDK Instructions):

public class MnistCustomModel extends CustomModel {
    // Path to the frozen graph bundled in the app's assets folder
    private static final String MODEL_PATH = "file:///android_asset/mnist.pb";
    // The model ID from the webapp (Custom Model > Your Model > SDK Instructions)
    private static final String MODEL_ID = "<your model id>";
    private static final int MODEL_VERSION = 1;

    public MnistCustomModel(Context context) {
        super(MODEL_PATH, MODEL_ID, MODEL_VERSION);
    }
}

4. Create a TFM interpreter and run prediction.

From here, you can create a TensorFlow Mobile interpreter and run prediction with the managed model.

// Create an interpreter for your managed model
FritzTFMobileInterpreter interpreter = FritzTFMobileInterpreter.create(new MnistCustomModel(context));

// Feed the input buffer into the named input tensor
interpreter.feed(inputName, inputValues, dimenSize, inputSize, inputSize, numChannels);

// Run inference.
String[] outputNames = new String[]{outputName};
interpreter.run(outputNames);

// Copy the output into the output array.
interpreter.fetch(outputName, outputValues);
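
Putting it together, here is a minimal sketch of a single prediction with assumed MNIST shapes (a 28x28 grayscale image and 10 output classes); the tensor names "input" and "output" are placeholders for the names in your own graph:

// Assumed shapes and tensor names for an MNIST-style model; adjust for your graph.
float[] inputValues = new float[28 * 28];   // normalized grayscale pixels
float[] outputValues = new float[10];       // one score per digit class

FritzTFMobileInterpreter interpreter =
        FritzTFMobileInterpreter.create(new MnistCustomModel(context));

// Feed a single 28x28x1 image, run the graph, and read back the class scores.
interpreter.feed("input", inputValues, 1, 28, 28, 1);
interpreter.run(new String[]{"output"});
interpreter.fetch("output", outputValues);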

Note

The FritzTFMobileInterpreter class contains all the same methods as TensorFlowInferenceInterface.

// Changed from inferenceInterface.feed(...)
fritzInterpreter.feed(...)

// Changed from inferenceInterface.run(...)
fritzInterpreter.run(...)

// Changed from inferenceInterface.fetch(...)
fritzInterpreter.fetch(...)

5. Track the performance of your model

Run your app to start tracking model performance. Afterwards, you can see the measurements appear in the webapp under Dashboard:

Average time / prediction (ms) chart