TensorFlow Lite

If you’ve already trained and converted your own model for mobile, you can use the custom model library to manage your models on the edge. We currently support TensorFlow Lite and TensorFlow Mobile for running models on devices with Android 5.0 (Lollipop, API level 21) and higher.

Note

If you haven’t set up the SDK yet, make sure to go through those directions first. You’ll need to add the Core library to your app before using any of the feature or custom model libraries.


1. Add Fritz’s TFLite Library

Edit your app/build.gradle file:

Add our Maven repository so Gradle can resolve the Fritz libraries:

repositories {
    maven { url "https://raw.github.com/fritzlabs/fritz-repository/master" }
}

Add Fritz to your Gradle dependencies and resync the project:

dependencies {
    implementation 'ai.fritz:core:1.3.2'
    implementation 'ai.fritz:custom-model-tflite:1.3.2'
}

2. Register the JobService in your Android Manifest

To monitor and update your models, you’ll need to register the JobService so it can listen for changes and handle over-the-air (OTA) updates.

To do this, open up your AndroidManifest.xml and add these lines:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="ai.fritz.fritzsdkapp">

    <!-- Fritz SDK internet permission -->
    <uses-permission android:name="android.permission.INTERNET" />

    <application
        android:allowBackup="false"
        android:theme="@style/AppTheme">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

        <!-- Register FritzJob as a service and include your API key -->
        <meta-data android:name="fritz_api_key" android:value="api-key-12345" />
        <service
            android:name="ai.fritz.core.FritzJob"
            android:exported="true"
            android:permission="android.permission.BIND_JOB_SERVICE" />
    </application>
</manifest>

3. Replace the TensorFlow Lite Interpreter with FritzTFLiteInterpreter

FritzTFLiteInterpreter lets you manage your models remotely and, under the hood, provides the same methods as TFLite’s Interpreter.

Let’s assume that you’ve stored your digits.tflite model file in the app’s assets folder.
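
The snippet below references a loadModelFile helper. That helper isn’t part of the Fritz SDK; a minimal sketch of the standard TFLite pattern for memory-mapping a model from assets might look like this (it assumes the method lives in an Activity or other Context):

import android.content.res.AssetFileDescriptor;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

// Memory-map a TFLite model stored in the assets folder.
private MappedByteBuffer loadModelFile(String filename) throws IOException {
    AssetFileDescriptor fileDescriptor = getAssets().openFd(filename);
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}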

Change:

Interpreter tflite = new Interpreter(loadModelFile("digits.tflite"));

To:

ModelSettings settings = new ModelSettings.Builder()
    .modelId("model-id-abcde")
    .modelPath("file:///android_asset/digits.tflite")
    .modelVersion(1)
    .build();

FritzTFLiteInterpreter tflite = FritzTFLiteInterpreter.create(context, settings);
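
Because FritzTFLiteInterpreter exposes the same methods as TFLite’s Interpreter, the rest of your prediction code can stay unchanged. A minimal sketch of running a prediction, assuming a digits model with a 28x28 grayscale input and 10 output classes (the shapes are illustrative, not part of the Fritz API):

// Shapes below are illustrative; match them to your own model.
float[][][][] input = new float[1][28][28][1];  // one 28x28 grayscale image
float[][] output = new float[1][10];            // confidence scores for digits 0-9

// Same signature as TFLite's Interpreter.run().
tflite.run(input, output);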

4. Build and run your app

Test each part of your app that runs predictions, then check the Fritz dashboard to confirm that data is showing up.