Even with the advent of more sophisticated, data-hungry methods, boosted decision trees remain extraordinarily successful for fast rigid object detection, achieving top accuracy on numerous datasets. While effective, most boosted detectors use decision trees with orthogonal (single feature) splits, and the topology of the resulting decision boundary may not be well matched to the natural topology of the data. Given highly correlated data, decision trees with oblique (multiple feature) splits can be effective. Use of oblique splits, however, comes at considerable computational expense. Inspired by recent work on discriminative decorrelation of HOG features, we instead propose an efficient feature transform that removes correlations in local neighborhoods. The result is an overcomplete but locally decorrelated representation ideally suited for use with orthogonal decision trees. In fact, orthogonal trees with our locally decorrelated features outperform oblique trees trained over the original features at a fraction of the computational cost. The overall improvement in accuracy is dramatic: on the Caltech Pedestrian Dataset, we reduce false positives nearly tenfold over the previous state-of-the-art.
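The decorrelating transform described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the decorrelation filters are taken as the top eigenvectors of the covariance of small local patches sampled from a feature channel, and all names (`local_decorrelation_filters`, the patch size, the number of sampled patches) are illustrative choices.

```python
import numpy as np

def local_decorrelation_filters(feature_maps, patch=5, n_filters=4, rng=None):
    """Estimate locally decorrelating filters from feature maps.

    feature_maps: array of shape (N, H, W) -- N single-channel maps.
    Samples small patches, estimates their covariance, and returns the
    top `n_filters` eigenvectors reshaped into patch x patch filters.
    Convolving a channel with these filters yields an overcomplete,
    locally decorrelated representation.
    """
    rng = np.random.default_rng(rng)
    patches = []
    for fmap in feature_maps:
        H, W = fmap.shape
        for _ in range(200):  # sample random local neighborhoods
            y = rng.integers(0, H - patch + 1)
            x = rng.integers(0, W - patch + 1)
            patches.append(fmap[y:y + patch, x:x + patch].ravel())
    P = np.asarray(patches, dtype=float)
    P -= P.mean(axis=0)                       # center the patch samples
    cov = P.T @ P / len(P)                    # local patch covariance
    w, V = np.linalg.eigh(cov)                # eigenvalues ascending
    top = V[:, ::-1][:, :n_filters]           # leading eigenvectors
    return top.T.reshape(n_filters, patch, patch)
```

Because the filters are eigenvectors of the local covariance, filter responses within a neighborhood are (approximately) uncorrelated, which is what makes cheap orthogonal (single-feature) splits effective on the transformed features.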
Author Information
Woonhyun Nam (StradVision)
Piotr Dollar (Facebook AI Research)
Joon Hee Han (POSTECH)
More from the Same Authors
- 2021 Poster: Early Convolutions Help Transformers See Better »
  Tete Xiao · Mannat Singh · Eric Mintun · Trevor Darrell · Piotr Dollar · Ross Girshick
- 2015 Poster: Learning to Segment Object Candidates »
  Pedro O. Pinheiro · Ronan Collobert · Piotr Dollar
- 2015 Spotlight: Learning to Segment Object Candidates »
  Pedro O. Pinheiro · Ronan Collobert · Piotr Dollar
- 2006 Poster: Learning to Traverse Image Manifolds »
  Piotr Dollar · Vincent Rabaud · Serge Belongie