iOS

You can use the FritzVisionStyleModel to stylize images. Fritz provides a variety of options to configure predictions. These instructions will help you get Style Transfer running in your app in no time.

Note

If you haven’t set up the SDK yet, make sure to go through those directions first. You’ll need to add the Core library to the app before using the specific feature or custom model libraries.

1. Add the model to your project

Include the FritzVisionStyleModel in your Podfile. This will include all styles in your Fritz bundle.

pod 'Fritz/VisionStyleModel'

Make sure to install the added pod:

pod install
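For context, a complete Podfile for this step might look like the following sketch. The target name `MyApp` and the platform version are placeholders; adjust them to match your project, and keep any other pods you already depend on:

```ruby
platform :ios, '10.0'

target 'MyApp' do
  use_frameworks!

  # Pulls in the Fritz Core library plus all style transfer models.
  pod 'Fritz/VisionStyleModel'
end
```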

2. Define FritzVisionStyleModel

Choose which FritzVisionStyleModel you want to use. You can pick a specific style or access a list of all models. Create a single instance of each model and share it across all predictions:

Swift, using a specific style:

import Fritz
lazy var styleModel = FritzVisionStyleModel.starryNight

Swift, using all styles:

import Fritz
let styleModels = FritzVisionStyleModel.allModels()

Objective-C, using a specific style:

@import Fritz;
FritzVisionStyleModel *styleModel = [FritzVisionStyleModel starryNight];

Objective-C, using all styles:

@import Fritz;
NSArray *styleModels = [FritzVisionStyleModel allModels];
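If you load all styles, one common pattern is to keep them in an array and cycle through them as the user taps. A minimal sketch; the round-robin helper below is our own convenience, not part of the SDK:

```swift
import Fritz

let styleModels = FritzVisionStyleModel.allModels()
var currentStyleIndex = 0

// Return the current style and advance to the next one, wrapping
// back to the first style after the last. Call this from a tap
// handler to cycle through every bundled style.
func nextStyleModel() -> FritzVisionStyleModel {
    let model = styleModels[currentStyleIndex]
    currentStyleIndex = (currentStyleIndex + 1) % styleModels.count
    return model
}
```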

3. Create FritzVisionImage

FritzVisionImage supports different image formats.

  • Using a CMSampleBuffer

    If you are using a CMSampleBuffer from the built-in camera, first create the FritzVisionImage instance:

    let image = FritzVisionImage(buffer: sampleBuffer)
    
    FritzVisionImage *visionImage = [[FritzVisionImage alloc] initWithBuffer: sampleBuffer];
    // or
    FritzVisionImage *visionImage = [[FritzVisionImage alloc] initWithImage: uiImage];
    

    The image orientation data needs to be set properly for predictions to work. Use FritzVisionImageMetadata to customize the orientation for an image. If you attach a FritzVisionImageMetadata instance, the orientation defaults to .right:

    image.metadata = FritzVisionImageMetadata()
    image.metadata?.orientation = .left
    
    // Add metadata
    visionImage.metadata = [FritzVisionImageMetadata new];
    visionImage.metadata.orientation = FritzImageOrientationLeft;
    

    Note

    Data passed in from the camera will generally need the orientation set. When using a CMSampleBuffer to create a FritzVisionImage, the correct orientation depends on which camera and device orientation you are using.

    When using the back camera in portrait device orientation, the orientation should be .right (the default when you attach FritzVisionImageMetadata to the image). When using the front-facing camera in portrait device orientation, the orientation should be .left.

    You can initialize a FritzImageOrientation from the AVCaptureConnection to infer the orientation (when the device orientation is portrait):

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        ...
        image.metadata = FritzVisionImageMetadata()
        image.metadata?.orientation = FritzImageOrientation(connection)
        ...
    }
    
  • Using a UIImage

    If you are using a UIImage, create the FritzVisionImage instance:

    let image = FritzVisionImage(image: uiImage)
    

    The image orientation data needs to be set properly for predictions to work. Use FritzVisionImageMetadata to customize the orientation for an image:

    image.metadata = FritzVisionImageMetadata()
    image.metadata?.orientation = .right
    

    Note

    UIImage can have associated UIImageOrientation data (for example, when capturing a photo from the camera). To make sure the model handles the orientation correctly, initialize the FritzImageOrientation from the image's imageOrientation property:

    image.metadata?.orientation = FritzImageOrientation(image.imageOrientation)
    

4. Stylize your Image

Now, use the styleModel instance you created earlier to stylize your image:

styleModel.predict(image) { stylizedBuffer, error in
  guard error == nil, let stylizedBuffer = stylizedBuffer else { return }

  // Code to work with stylized image here.
  // If you're not sure how to use the output image, check out the public heartbeat
  // project.
}
[styleModel predict:image completion:^(CVPixelBufferRef stylizedBuffer, NSError *error) {
  // Code to work with stylized image here.
  // If you're not sure how to use the output image, check out the public heartbeat
  // project.
}];
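The completion handler hands back the stylized frame as a pixel buffer. One way to turn it into a UIImage for display is to go through Core Image; this conversion helper is our own sketch, not part of the Fritz SDK:

```swift
import UIKit
import CoreImage

// Convert the model's output CVPixelBuffer into a UIImage for display.
func uiImage(from buffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: buffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

Creating a CIContext is relatively expensive, so in a real app you would typically create it once and reuse it across frames.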

For a full style transfer tutorial, take a look at this post on Heartbeat.