
The More you Learn, the Less you Store: Memory-Controlled Incremental SVM

A. Pronobis, B. Caputo

In ECCV 2006 2nd International Cognitive Vision Workshop (ICVW), 2006.

About

The capability to learn from experience and incrementally update its internal representation is a key property for a visual recognition algorithm aiming to work in real-world scenarios. This paper presents a novel SVM-based algorithm for visual object recognition, capable of learning model representations incrementally. We combine an incremental extension of SVMs with a method which reduces the number of support vectors needed to build the decision function without any loss in performance. The resulting algorithm is guaranteed to achieve the same recognition performance as the original incremental method while reducing the memory requirements. We then introduce a parameter which permits a user-set trade-off between performance and memory reduction. This property is potentially useful in applications where the memory size of the visual models must be kept under control. Results show that it is possible to achieve a consistent reduction of the memory requirements with only a moderate loss in performance. For example, experiments show that accepting a 5% drop in recognition rate yields a memory reduction of up to 50%.
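The general idea can be illustrated with a minimal sketch: each incremental step retrains on the new batch plus the support vectors carried over from the previous model, and a user-set parameter (here a hypothetical `keep_fraction`, not the paper's exact formulation) prunes the least influential support vectors to cap memory. This is an illustrative approximation, not the authors' algorithm.

```python
# Illustrative sketch of memory-controlled incremental SVM training.
# NOTE: keep_fraction and the |alpha|-based pruning rule are assumptions
# made for this example; the paper's actual reduction method differs.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_batch(n=100):
    """Generate a synthetic linearly separable 2-D batch."""
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    return X, y

def incremental_fit(batches, keep_fraction=1.0):
    """Fit incrementally: each step trains on the new batch plus the
    (possibly pruned) support vectors kept from the previous model."""
    sv_X = np.empty((0, 2))
    sv_y = np.empty((0,), dtype=int)
    model = None
    for X, y in batches:
        X_train = np.vstack([sv_X, X])
        y_train = np.concatenate([sv_y, y])
        model = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
        sv_X = model.support_vectors_
        sv_y = y_train[model.support_]
        if keep_fraction < 1.0:
            # Prune support vectors with the smallest |alpha| (least
            # influential), trading accuracy for memory.
            weights = np.abs(model.dual_coef_).sum(axis=0)
            k = max(2, int(keep_fraction * len(weights)))
            idx = np.argsort(weights)[-k:]
            sv_X, sv_y = sv_X[idx], sv_y[idx]
    return model, len(sv_X)

batches = [make_batch() for _ in range(5)]
_, n_full = incremental_fit(batches, keep_fraction=1.0)
_, n_small = incremental_fit(batches, keep_fraction=0.5)
print(n_full, n_small)  # pruned run stores fewer support vectors
```

Only the stored support vectors, never the full data history, are carried between steps, which is what keeps the model memory bounded.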

BibTeX

@inproceedings{pronobis2006eccv-icvw,
  author =       {Pronobis, Andrzej and Caputo, Barbara},
  title =        {The More you Learn, the Less you Store: Memory-Controlled Incremental {SVM}},
  booktitle =    {ECCV 2006 2nd International Cognitive Vision Workshop (ICVW)},
  year =         2006,
  month =        may,
  address =      {Graz, Austria},
  url =          {http://www.pronobis.pro/publications/pronobis2006eccv-icvw}
}
© 2018. Copyright Andrzej Pronobis