Currently we support TensorFlow Lite for devices running Android 5.0 (Lollipop, SDK version 21) and higher.
Register the FritzCustomModelService in your Android Manifest
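Below is a minimal sketch of the manifest entry, assuming the service class is ai.fritz.core.FritzCustomModelService and runs as a bound job service; verify the fully qualified name and permission against the instructions for your SDK version.

```xml
<!-- Inside the <application> element of AndroidManifest.xml -->
<service
    android:name="ai.fritz.core.FritzCustomModelService"
    android:exported="true"
    android:permission="android.permission.BIND_JOB_SERVICE" />
```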
Creating an interpreter for your model
You can either include your model with the app or load the model when the app runs (the latter is recommended to reduce the size of your APK).
To find your model id, go to Your Project > Project Settings > Your App > Show API Key in the webapp.
Include your model on-device:
- Add your TensorFlow Lite model (.tflite) to your project's assets folder.
- Create a new FritzOnDeviceModel class linking to your included model (see the sketch below).
You may also include a separate class (you can download this class from the webapp under Custom Model > Your Model > SDK Instructions):
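For illustration, here is a hedged sketch of what that class might look like. The model path, model ID, and version are placeholders, and the constructor argument order (path, model ID, version) is an assumption; the class generated by the webapp will contain the real values.

```java
import ai.fritz.core.FritzOnDeviceModel;

// Hypothetical class; the version downloaded from the webapp has your real
// model ID and version filled in.
public class MyCustomOnDeviceModel extends FritzOnDeviceModel {
    // Path to the .tflite file you added to the assets folder (placeholder name).
    private static final String MODEL_PATH = "file:///android_asset/my_model.tflite";
    // Placeholder values; use the ID and version shown in the webapp.
    private static final String MODEL_ID = "your-model-id";
    private static final int MODEL_VERSION = 1;

    public MyCustomOnDeviceModel() {
        super(MODEL_PATH, MODEL_ID, MODEL_VERSION);
    }
}
```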
Download your model at runtime:
Only available on Growth plans
For more information on plans and pricing, visit our website.
Create a FritzManagedModel with the model id and then use it to call loadModel to load an on-device model.
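A minimal sketch of this flow, assuming a FritzModelManager helper and a ModelReadyListener callback; the class and package names are assumptions, so verify them against the SDK Instructions for your model.

```java
import ai.fritz.core.FritzManagedModel;
import ai.fritz.core.FritzModelManager;
import ai.fritz.core.FritzOnDeviceModel;
import ai.fritz.core.ModelReadyListener;

public class ModelLoader {
    // Call once Fritz has been configured, e.g. from your Activity's onCreate().
    public void downloadModel() {
        // Placeholder model ID; copy the real one from the webapp.
        FritzManagedModel managedModel = new FritzManagedModel("your-model-id");
        FritzModelManager modelManager = new FritzModelManager(managedModel);

        modelManager.loadModel(new ModelReadyListener() {
            @Override
            public void onModelReady(FritzOnDeviceModel onDeviceModel) {
                // The model has finished downloading (or was already cached)
                // and is ready to use.
            }
        });
    }
}
```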
To download the model over Wi-Fi only, specify this by adding the useWifi parameter to loadModel (by default, the model can be downloaded over any network connection).
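A hedged sketch of the Wi-Fi-only variant, assuming loadModel accepts the useWifi flag as a second boolean argument and that the downloaded model can then be wrapped in a FritzTFLiteInterpreter; both the parameter position and the interpreter class/package name are assumptions, so check the SDK Instructions for your model.

```java
import ai.fritz.core.FritzManagedModel;
import ai.fritz.core.FritzModelManager;
import ai.fritz.core.FritzOnDeviceModel;
import ai.fritz.core.ModelReadyListener;
import ai.fritz.customtflite.FritzTFLiteInterpreter; // assumed package/class name

public class WifiOnlyModelLoader {
    public void downloadModelOverWifi() {
        FritzManagedModel managedModel = new FritzManagedModel("your-model-id"); // placeholder ID
        FritzModelManager modelManager = new FritzModelManager(managedModel);

        // Second argument (assumed): true restricts the download to Wi-Fi connections.
        modelManager.loadModel(new ModelReadyListener() {
            @Override
            public void onModelReady(FritzOnDeviceModel onDeviceModel) {
                // Once the model is on-device, create an interpreter for it
                // (class name assumed; see "Creating an interpreter for your model").
                FritzTFLiteInterpreter interpreter = new FritzTFLiteInterpreter(onDeviceModel);
            }
        }, true);
    }
}
```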