SPPAS 4.20

Module sppas.src.imgdata

Class HaarCascadeDetector

Description

Detect objects in an image with a Haar Cascade Classifier.

The HaarCascadeClassifier, when used to detect objects, returns a list of
detections with weights instead of confidence scores. This class converts
the weights into scores in the range [min_score, 0.998] with a modified
version of the Unity-based normalization method.

This classifier already deletes overlapping detections and allows setting
a minimum size threshold to filter out objects that are too small.

Constructor

View Source
def __init__(self):
    super(HaarCascadeDetector, self).__init__()
    self._extension = '.xml'

Public functions

normalize_weights

Return the normalized list of values.

Use the Unity-based normalization, slightly adapted.

Parameters
  • dataset: (list) List of float weight values
Returns
  • list of confidence scores
View Source
def normalize_weights(self, dataset):
    """Return the normalized list of values.

        Use the Unity-based normalization, slightly adapted.

        :param dataset: (list) List of float weight values
        :returns: list of confidence scores

        """
    a = self.get_min_score()
    b = 0.998
    coeff = b - a
    norm_list = list()
    if isinstance(dataset, list):
        min_value = min(dataset) * b
        max_value = max(dataset) * 1.01
        for value in dataset:
            tmp = a + (value - min_value) * coeff / (max_value - min_value)
            norm_list.append(min(1.0, max(0.0, tmp)))
    return norm_list
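The adapted Unity-based normalization above can be reproduced standalone. A minimal sketch, where `min_score=0.18` is a hypothetical stand-in for the value returned by `get_min_score()`:

```python
def normalize_weights(dataset, min_score=0.18):
    """Map raw Haar-cascade weights to scores in [min_score, 0.998].

    min_score is a hypothetical default; in SPPAS it comes from
    get_min_score().
    """
    a = min_score
    b = 0.998
    coeff = b - a
    norm_list = []
    if isinstance(dataset, list):
        # Shrink the minimum and inflate the maximum slightly so no input
        # value lands exactly on the bounds of the target interval.
        min_value = min(dataset) * b
        max_value = max(dataset) * 1.01
        for value in dataset:
            tmp = a + (value - min_value) * coeff / (max_value - min_value)
            norm_list.append(min(1.0, max(0.0, tmp)))
    return norm_list

weights = [2.0, 5.0, 9.0]
scores = normalize_weights(weights)
```

Because the maximum is inflated by 1%, the largest weight maps strictly below 0.998, and the order of the input weights is preserved in the output scores.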

Private functions

_set_detector

Initialize the detector with the given file.

Parameters
  • model: (str) Filename of the XML Haar Cascade file
Raises

sppasError

View Source
def _set_detector(self, model):
    """Initialize the detector with the given file.

        :param model: (str) Filename of the XML Haar Cascade file
        :raises: Exception

        """
    try:
        self._detector = cv2.CascadeClassifier(model)
    except cv2.error as e:
        logging.error('Loading the XML Haar Cascade model failed.')
        raise sppasError(str(e))
_detection

Determine the coordinates of the detected objects.

Parameters
  • image: (sppasImage or numpy.ndarray)
View Source
def _detection(self, image):
    """Determine the coordinates of the detected objects.

        :param image: (sppasImage or numpy.ndarray)

        """
    detections = self.__haar_detections(image)
    if len(detections[0]) == 0:
        return
    scores = list()
    for d in detections[2]:
        if isinstance(d, (list, tuple)) is True:
            scores.append(d[0])
        else:
            scores.append(d)
    normalized = self.normalize_weights(scores)
    for rect, score in zip(detections[0], normalized):
        coords = sppasCoords(rect[0], rect[1], rect[2], rect[3])
        coords.set_confidence(score)
        self._coords.append(coords)
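The conversion performed by `_detection` can be illustrated with a mocked `detectMultiScale3`-style tuple (rectangles, reject levels, weights). This is a sketch, not the SPPAS implementation: `min_score=0.18` is a hypothetical stand-in for `get_min_score()`, and plain tuples replace `sppasCoords`:

```python
def to_scored_rects(detections, min_score=0.18):
    """Mimic _detection(): pair each rectangle with a normalized score."""
    rects, _reject_levels, weights = detections
    if len(rects) == 0:
        return []
    # Weights may arrive as scalars or as one-element sequences.
    scores = [w[0] if isinstance(w, (list, tuple)) else w for w in weights]
    # Same adapted Unity-based normalization as normalize_weights().
    a, b = min_score, 0.998
    lo = min(scores) * b
    hi = max(scores) * 1.01
    normalized = [min(1.0, max(0.0, a + (s - lo) * (b - a) / (hi - lo)))
                  for s in scores]
    return [((x, y, w, h), score)
            for (x, y, w, h), score in zip(rects, normalized)]

# Mocked detectMultiScale3-style output.
detections = ([(10, 20, 50, 50), (40, 60, 80, 80)], [24, 24], [[3.2], [7.5]])
pairs = to_scored_rects(detections)
```

The heavier-weighted rectangle ends up with the higher confidence score, which is what `_detection` stores into each `sppasCoords`.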

Protected functions

__haar_detections

Detect objects using the Haar Cascade classifier.

This classifier already deletes overlapping detections and detected objects

that are too small. Returned detections are a tuple with 3 values:

  • a list of N rectangles (x, y, w, h);
  • a list of N reject-level values (the same int repeated N times);
  • a list of N weight values.

Notice that the scale factor has a big impact on the estimation time:

  • with 1.04 => 5.3x real time;
  • with 1.10 => 2.2x real time, but 6% fewer detected objects (are the missed ones relevant?)
Parameters
  • image: (sppasImage)
  • scale_factor: (float) how much the image size is reduced at each image scale
Returns
  • (numpy arrays)
View Source
def __haar_detections(self, image, scale_factor=1.06):
    """Detect objects using the Haar Cascade classifier.

        This classifier already delete overlapped detections and too small
        detected objects. Returned detections are a tuple with 3 values:
            - a list of N-list of rectangles;
            - a list of N times the same int value (why???);
            - a list of N weight values.

        Notice that the scale factor has a big impact on the estimation time:
            - with 1.04 => 5.3x real time,
            - with 1.10 => 2.2x real time, but -6% of detected objects (relevant ones?)

        :param image: (sppasImage)
        :param scale_factor: (float) how much the image size is reduced at each image scale
        :return: (numpy arrays)

        """
    w, h = image.size()
    min_w = int(float(w) * self.get_min_ratio())
    min_h = int(float(h) * self.get_min_ratio())
    try:
        detections = self._detector.detectMultiScale3(image, scaleFactor=scale_factor, minNeighbors=3, minSize=(min_w, min_h), flags=0, outputRejectLevels=True)
    except cv2.error as e:
        self._coords = list()
        raise sppasError('HaarCascadeClassifier detection failed: {}'.format(str(e)))
    return detections
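The `minSize` passed to OpenCV above is derived from the image dimensions and the configured minimum ratio. A minimal sketch of that computation, where the 0.05 ratio is a hypothetical stand-in for the value returned by `get_min_ratio()`:

```python
def min_object_size(width, height, min_ratio):
    """Smallest object (in pixels) the detector will report.

    Both sides are scaled by the same ratio, as in __haar_detections().
    """
    min_w = int(float(width) * min_ratio)
    min_h = int(float(height) * min_ratio)
    return min_w, min_h

# A 640x480 image with a hypothetical min_ratio of 0.05:
min_w, min_h = min_object_size(640, 480, 0.05)
```

Any candidate rectangle smaller than this size in either dimension is discarded by OpenCV itself, so the filtering happens before normalization.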