TensorFlow Mobile

If you’ve already trained and converted your own model for mobile, you can use the custom model library to manage your models on the edge. We currently support TensorFlow Lite and TensorFlow Mobile for running models on Android 5.0 (Lollipop, API level 21) and higher.

Note

If you haven’t set up the SDK yet, make sure to go through the setup directions first. You’ll need to add the Core library to your app before using the feature-specific or custom model libraries.


1. Add Fritz’s TFMobile Library

Edit your app/build.gradle file:

Add our Maven repository so Gradle can resolve the Fritz libraries:

repositories {
    maven { url "https://raw.github.com/fritzlabs/fritz-repository/master" }
}

Add Fritz to your Gradle dependencies and resync the project:

dependencies {
    implementation 'ai.fritz:core:1.3.2'
    implementation 'ai.fritz:custom-model-tfmobile:1.3.2'
}

2. Register the Fritz JobService in your Android Manifest

In order for Fritz to monitor and update your models, we’ll need to make sure the INTERNET permission is enabled and that a JobService is registered to listen for changes and handle over-the-air (OTA) updates.

To do this, open your AndroidManifest.xml and add the lines below (replace api-key-12345 with your app’s API key):

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="ai.fritz.fritzsdkapp">
    <!--
      ----------------------------------------------
         Fritz SDK internet permissions
      ----------------------------------------------
    -->
    <uses-permission android:name="android.permission.INTERNET" />

    <application
        android:allowBackup="false"
        android:theme="@style/AppTheme">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
        <!--
          ------------------------------------
            Register FritzJob as a service
            and include the api key
          ------------------------------------
         -->
        <meta-data android:name="fritz_api_key" android:value="api-key-12345" />
        <service
            android:name="ai.fritz.core.FritzJob"
            android:exported="true"
            android:permission="android.permission.BIND_JOB_SERVICE" />
         <!--
          -----------------------------
                      END
          -----------------------------
         -->
    </application>

</manifest>

3. Replace the TensorFlowInferenceInterface class with FritzTFMobileInterpreter

FritzTFMobileInterpreter allows you to manage your models remotely and, under the hood, provides the same methods as TensorFlowInferenceInterface.

Find:

TensorFlowInferenceInterface inferenceInterface = new TensorFlowInferenceInterface(
    assetManager, "file:///android_asset/digits.pb");

Change to:

ModelSettings settings = new ModelSettings.Builder()
    .modelId("model-id-abcde")
    .modelPath("file:///android_asset/digits.pb")
    .modelVersion(1)
    .build();

FritzTFMobileInterpreter fritzInterpreter = FritzTFMobileInterpreter.create(context, settings);

Note

The FritzTFMobileInterpreter class contains all the same methods as TensorFlowInferenceInterface.

// Changed from inferenceInterface.feed(...)
fritzInterpreter.feed(...)

// Changed from inferenceInterface.run(...)
fritzInterpreter.run(...)

// Changed from inferenceInterface.fetch(...)
fritzInterpreter.fetch(...)
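
Putting the pieces together, one full prediction pass might look like the sketch below. The node names ("input", "output") and tensor shapes are hypothetical placeholders; substitute the names and dimensions defined in your own graph.

// A minimal sketch of one prediction pass. The node names and shapes
// below are placeholders; use the ones from your own model.
float[] input = new float[28 * 28];   // e.g. a flattened 28x28 grayscale image
float[] output = new float[10];       // e.g. scores for 10 digit classes

// Copy the input data into the graph's input tensor.
fritzInterpreter.feed("input", input, 1, 28, 28, 1);

// Execute the graph up to the requested output nodes.
fritzInterpreter.run(new String[] {"output"});

// Copy the result tensor back out into the output array.
fritzInterpreter.fetch("output", output);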

4. Build and run your app

Test each part of your app that runs predictions, then check the Fritz dashboard to confirm that data is showing up.
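
While you test, it can help to log the raw model output locally so you can cross-check it against what appears in the dashboard. A small sketch, reusing the output array from the example above:

// Requires imports: android.util.Log, java.util.Arrays.
// Log the fetched output so local predictions can be compared against
// the data reported in the Fritz dashboard.
Log.d("FritzDemo", "Prediction output: " + Arrays.toString(output));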