Fritz AI lets you manage your models across live devices. You can upload new and updated versions of a model and deploy them to users automatically, without building a new version of your app or going through an app store release process.
If you've already converted your custom model for mobile, you can use Fritz AI to manage models on device. By using the custom model library, you'll be able to monitor your model's usage and performance and update your models over-the-air.
What you can do with managed Custom Models
- Monitor model performance on different devices and operating systems.
- Iterate on your models and deploy over-the-air updates with Model Versioning.
- Add Tags and Metadata to each model and access them with the SDK.
- Package and distribute different models to edge devices with tagged attributes.
Getting started with a Custom Model
- In your project, create your custom model by uploading your model in one of the supported formats.
- After you've uploaded the model, take note of two things, both of which can be found on the model details page:
- Your API key, which is generated as part of setting up the SDK.
- Your model ID, which is used to track the specific model.
- Finally, you'll need to add the library for the specific mobile ML framework you're using.
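To make the two identifiers from the steps above concrete, here is a minimal sketch of how an app might carry them together. The class and method names below are purely illustrative assumptions, not the actual Fritz SDK API:

```java
// Hypothetical sketch only: ModelConfig and describe() are illustrative
// names, not part of the real Fritz SDK.
public class ModelSetup {

    // Holds the two values noted from the model details page.
    static final class ModelConfig {
        final String apiKey;  // generated as part of setting up the SDK
        final String modelId; // identifies the specific model
        ModelConfig(String apiKey, String modelId) {
            this.apiKey = apiKey;
            this.modelId = modelId;
        }
    }

    // Stand-in for handing the configuration to a model loader;
    // truncates the key so it is not logged in full.
    static String describe(ModelConfig config) {
        return "model " + config.modelId
                + " (key " + config.apiKey.substring(0, 4) + "...)";
    }

    public static void main(String[] args) {
        ModelConfig config = new ModelConfig("abcd1234efgh", "my-model-id");
        System.out.println(describe(config));
    }
}
```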
Distributing models over-the-air (OTA)
Model Versioning - With the SDK, you can update on-device models over the air, without a new app store release. This lets you iterate faster and deliver higher-quality models that improve your features.
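The core of an over-the-air update is a version check: replace the model cached on the device only when the server advertises a newer one. The sketch below assumes integer version numbers and is not the Fritz SDK's actual update logic:

```java
// Illustrative version-check logic (not the Fritz SDK API): decide whether
// an over-the-air download should replace the model already on the device.
public class ModelVersioning {

    // True when the server has a strictly newer version than the one
    // currently cached (or bundled) on the device.
    static boolean shouldDownload(int onDeviceVersion, int latestServerVersion) {
        return latestServerVersion > onDeviceVersion;
    }

    public static void main(String[] args) {
        // The app shipped with version 1; the backend now serves version 3.
        System.out.println(shouldDownload(1, 3)); // prints "true"
        // Already up to date: no download needed.
        System.out.println(shouldDownload(3, 3)); // prints "false"
    }
}
```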
Tag-based distribution - Add tags and metadata to on-device machine learning models. Models can be queried by tag and loaded dynamically via the SDK, giving you more control over distribution and usage. Deliver models to users based on hardware, location, software environment, or any other attribute.
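Tag-based querying can be pictured as filtering a model registry by the current device's attributes. The sketch below uses hypothetical names (`ModelEntry`, `query`) to show the idea; it is not the Fritz SDK's query API:

```java
import java.util.*;

// Illustrative tag-based selection (hypothetical names, not the Fritz SDK):
// return every model whose tags cover the requested attributes.
public class TagDistribution {

    static final class ModelEntry {
        final String modelId;
        final Set<String> tags;
        ModelEntry(String modelId, String... tags) {
            this.modelId = modelId;
            this.tags = new HashSet<>(Arrays.asList(tags));
        }
    }

    // IDs of all models carrying every required tag.
    static List<String> query(List<ModelEntry> models, Set<String> required) {
        List<String> matches = new ArrayList<>();
        for (ModelEntry m : models) {
            if (m.tags.containsAll(required)) {
                matches.add(m.modelId);
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        List<ModelEntry> registry = Arrays.asList(
            new ModelEntry("seg-small", "android", "low-end"),
            new ModelEntry("seg-large", "android", "high-end"),
            new ModelEntry("seg-ios", "ios", "high-end"));
        // A high-end Android device asks for a matching model.
        System.out.println(query(registry,
            new HashSet<>(Arrays.asList("android", "high-end"))));
        // prints "[seg-large]"
    }
}
```

In a real deployment the registry lives on the server, so the same query also controls which model files are packaged and delivered to each class of device.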
Supported Mobile ML Frameworks
Currently we support the following mobile machine learning frameworks:
- Core ML
- TensorFlow Lite