FritzVisionSegmentationResult

@objc(FritzVisionSegmentationResult)
public class FritzVisionSegmentationResult : NSObject
  • Height of model output array.

    Declaration

    Swift

    @objc
    public let height: Int
  • Width of model output array.

    Declaration

    Swift

    @objc
    public let width: Int
  • Model classes.

    Declaration

    Swift

    @objc
    public let classes: [ModelSegmentationClass]
  • Raw MLMultiArray result from prediction.

    Declaration

    Swift

    @objc
    public let predictionResult: MLMultiArray
  • Creates a 2D array the same size as the model output, with each point holding the index of the most likely class.

    Declaration

    Swift

    @objc(getMaxIndices:)
    public func getMaxIndices(minThreshold: Double = 0.0) -> [Int32]

    Parameters

    minThreshold

    Only include classes whose probability is greater than minThreshold.
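
    A minimal sketch of how the returned array might be used. It assumes `result` is a FritzVisionSegmentationResult obtained from a segmentation predictor, and that the returned values are laid out row-major with `height` rows and `width` columns (an assumption inferred from the "2D array same size as the model output" description, not stated explicitly in these docs):

    ```swift
    import Fritz

    // Hypothetical helper: look up the most likely class index at a
    // given (x, y) position in the model's output grid.
    func classIndex(atX x: Int, y: Int, in result: FritzVisionSegmentationResult) -> Int32 {
        // Ignore classes whose probability is 0.3 or lower.
        let indices = result.getMaxIndices(minThreshold: 0.3)
        // Assumed row-major layout: row `y` spans indices [y * width, (y + 1) * width).
        return indices[y * result.width + x]
    }
    ```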

  • Generate UIImage mask from most likely class at each pixel.

    The generated image is sized to fit the original image passed into prediction, with rotation applied. If the image was center cropped, an image covering the cropped region is returned.

    Declaration

    Swift

    @objc(toImageMask:alpha:)
    public func toImageMask(minThreshold: Double = 0.0, alpha: UInt8 = 255) -> UIImage?

    Parameters

    minThreshold

    Minimum probability a class must have to be counted; zero by default. Set this parameter to filter out classes that are the most likely at a pixel but still have a low absolute probability.

    alpha

    Alpha value (0-255) of the color applied to detected classes; fully opaque by default.
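
    A sketch of compositing the mask over the source image, assuming `result` and `sourceImage` come from the same prediction call (both names are hypothetical; the overlay approach uses standard UIKit drawing, not a Fritz API):

    ```swift
    import Fritz
    import UIKit

    // Draw a semi-transparent all-class mask on top of the source image.
    func overlayMask(from result: FritzVisionSegmentationResult,
                     on sourceImage: UIImage) -> UIImage? {
        // Classes below 50% probability are filtered out; alpha 128 keeps
        // the underlying image visible through the mask.
        guard let mask = result.toImageMask(minThreshold: 0.5, alpha: 128) else {
            return nil
        }
        UIGraphicsBeginImageContextWithOptions(sourceImage.size, false, sourceImage.scale)
        defer { UIGraphicsEndImageContext() }
        let rect = CGRect(origin: .zero, size: sourceImage.size)
        sourceImage.draw(in: rect)
        // The mask is generated to fit the original image, so it can be
        // drawn into the same rect.
        mask.draw(in: rect)
        return UIGraphicsGetImageFromCurrentImageContext()
    }
    ```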

  • Generate UIImage mask of given class, filtering out values below threshold.

    The generated image is sized to fit the original image passed into prediction, with rotation applied. If the image was center cropped, an image covering the cropped region is returned.

    Declaration

    Swift

    @objc(toImageMask:threshold:alpha:minThresholdAccepted:)
    public func toImageMask(of segmentClass: ModelSegmentationClass, threshold: Double = 0.5, alpha: UInt8 = 255, minThresholdAccepted: Double = 0.5) -> UIImage?

    Parameters

    segmentClass

    Class to mask.

    threshold

    Probability cutoff. Any class probabilities below this value are filtered out.

    alpha

    Alpha value of the color (0-255) for detected classes.

    minThresholdAccepted

    Any confidence score below this value will have an alpha of 0. Class confidence scores between minThresholdAccepted and threshold retain their original value.
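
    A sketch of masking a single class, assuming the result's `classes` array contains an entry labeled "person" and that ModelSegmentationClass exposes a `label` property (both are assumptions for illustration, not confirmed by these docs):

    ```swift
    import Fritz
    import UIKit

    // Build a mask for one class only, with a soft confidence band.
    func personMask(from result: FritzVisionSegmentationResult) -> UIImage? {
        // Hypothetical lookup by label; adjust to however your model
        // identifies its classes.
        guard let personClass = result.classes.first(where: { $0.label == "person" }) else {
            return nil
        }
        // Scores at or above 0.7 are fully opaque; scores between 0.3 and
        // 0.7 keep their original value; scores below 0.3 are transparent.
        return result.toImageMask(of: personClass,
                                  threshold: 0.7,
                                  alpha: 255,
                                  minThresholdAccepted: 0.3)
    }
    ```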