If you’ve already trained and converted your own model for mobile, you can use the custom model library to manage your models on the edge. Currently we support TensorFlow Lite and TensorFlow Mobile for running models on devices with Android 5.0 (Lollipop, SDK version 21) and higher.
If you haven’t set up the SDK yet, make sure to go through the setup directions first. You’ll need to add the Core library to the app before using the feature-specific or custom model libraries.
1. Add Fritz’s TFLite Library
Edit your app/build.gradle file:
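A minimal sketch of the dependency block is below. The artifact coordinates and version number here are assumptions for illustration; copy the exact lines from the SDK instructions in the webapp:

```groovy
dependencies {
    // Fritz Core (added during SDK setup) plus the TFLite custom model library.
    // Artifact names and the version below are illustrative; use the exact
    // lines shown in the webapp's SDK instructions.
    implementation 'ai.fritz:core:3.0.0'
    implementation 'ai.fritz:custom-model-tflite:3.0.0'
}
```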
2. Register the FritzCustomModelService in your Android Manifest
To monitor and update your models, we’ll need to register the JobService to listen for changes and handle over-the-air (OTA) updates.
To do this, open up your AndroidManifest.xml and add these lines:
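A sketch of the service registration is below. The fully qualified class name assumes the service lives in the Fritz Core package; verify it against the SDK instructions in the webapp:

```xml
<application>
    <!-- Registers Fritz's JobService so the SDK can check for and download
         over-the-air (OTA) model updates. The package prefix (ai.fritz.core)
         is an assumption; confirm it in the SDK instructions. -->
    <service
        android:name="ai.fritz.core.FritzCustomModelService"
        android:exported="true"
        android:permission="android.permission.BIND_JOB_SERVICE" />
</application>
```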
3. Add your custom model to your Android app’s asset folder
Make sure you’ve added your TensorFlow Lite model (.tflite) to your project’s assets folder.
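For a standard Android project layout, the default assets location looks like this (using the mnist.tflite example from the next step):

```
app/src/main/assets/mnist.tflite
```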
4. Download or copy your Custom Model class to your app
FritzTFLiteInterpreter allows you to manage your models remotely and, under the hood, provides the same methods as TFLite’s Interpreter.
Let’s pretend that you’ve stored a mnist.tflite model in the assets folder. You’ll want to create a custom model class like this (you can download this class from the webapp under Custom Model > Your Model > SDK Instructions):
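As a rough sketch of what the generated class might look like: the base class name, constructor signature, and placeholder model ID below are all assumptions, so download the real class from the webapp rather than writing it by hand.

```java
// Illustrative sketch only: the base class (FritzOnDeviceModel), the
// constructor signature, and the placeholder model ID are assumptions.
// Download the real class from the webapp under
// Custom Model > Your Model > SDK Instructions.
public class MnistCustomModel extends FritzOnDeviceModel {

    private static final String MODEL_PATH = "file:///android_asset/mnist.tflite";
    private static final String MODEL_ID = "<your-model-id>"; // from the webapp
    private static final int MODEL_VERSION = 1;

    public MnistCustomModel() {
        super(MODEL_PATH, MODEL_ID, MODEL_VERSION);
    }
}
```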
5. Create a TFLite interpreter and run predictions
From here, you can create a TensorFlow Lite interpreter and run predictions with the managed model:

```java
// Create an interpreter
FritzTFLiteInterpreter tflite = FritzTFLiteInterpreter.create(new MnistCustomModel());

// Run prediction with input / output buffers
tflite.run(inputBuffer, outputBuffer);
```
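Since the Fritz interpreter exposes the same methods as TFLite’s Interpreter, the buffers follow the usual TFLite conventions. The shapes below assume a hypothetical MNIST classifier taking a single 28x28 grayscale float input and producing 10 class scores; match the buffers to your own model’s tensors:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Input: a single 28 x 28 x 1 float32 image (4 bytes per float),
// assuming a hypothetical MNIST classifier.
ByteBuffer inputBuffer = ByteBuffer.allocateDirect(1 * 28 * 28 * 1 * 4)
        .order(ByteOrder.nativeOrder());
// ... fill inputBuffer with normalized pixel values ...

// Output: one row of scores for the 10 digit classes.
float[][] outputBuffer = new float[1][10];

tflite.run(inputBuffer, outputBuffer);
```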
6. Track the performance of your model
Run your app with the prediction code above in order to track model performance. Afterwards, you can see the measurements appear in the webapp under Dashboard: