Chunk incremental learning for cost-sensitive hinge loss support vector machine

Bin Gu, Xin Quan, Yunhua Gu, Victor S. Sheng, Guansheng Zheng

Research output: Contribution to journal › Article › peer-review

44 Scopus citations


Cost-sensitive learning arises in many real-world applications and represents an important learning paradigm in machine learning. The recently proposed cost-sensitive hinge loss support vector machine (CSHL-SVM) guarantees consistency with the cost-sensitive Bayes risk, and this technique provides better generalization accuracy compared to traditional cost-sensitive support vector machines. In practice, data typically arrive in the form of sequential chunks, also called an on-line scenario. However, conventional batch learning algorithms waste a considerable amount of time in the on-line scenario because they must re-train the model from scratch. To make CSHL-SVM more practical for the on-line scenario, we propose a chunk incremental learning algorithm for CSHL-SVM, which can update a trained model without re-training from scratch when incorporating a chunk of new samples. Our method is efficient because it can update the trained model with multiple samples at a time, not just one sample at a time. Our experimental results on a variety of datasets not only confirm the effectiveness of CSHL-SVM but also show that our method is more efficient than both the batch algorithm for CSHL-SVM and the single-sample incremental learning method for CSHL-SVM.
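The abstract's on-line scenario can be mimicked with a minimal sketch. Note the assumptions: the paper's actual algorithm performs exact incremental updates of the CSHL-SVM solution, which is not reproduced here; instead, this sketch uses scikit-learn's `SGDClassifier` with hinge loss and `class_weight` as a stand-in for cost sensitivity, and the data, chunk size, and cost ratio are all hypothetical.

```python
# Illustrative sketch only: this is NOT the paper's chunk incremental
# CSHL-SVM algorithm. It merely mimics the on-line chunk scenario using
# scikit-learn's SGDClassifier (hinge loss + class_weight for cost
# sensitivity), an approximate stand-in method.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.RandomState(0)

# Synthetic binary data; class 1 is treated as the costly class (assumption).
X = rng.randn(1000, 5)
y = (X[:, 0] + 0.5 * rng.randn(1000) > 0.8).astype(int)

# Hypothetical costs: misclassifying class 1 is 5x as costly as class 0.
model = SGDClassifier(loss="hinge", class_weight={0: 1.0, 1: 5.0},
                      random_state=0)

# Data arrive in sequential chunks; each chunk updates the trained model
# in place, rather than re-training from scratch on all data seen so far.
chunk_size = 100
for start in range(0, len(X), chunk_size):
    Xc = X[start:start + chunk_size]
    yc = y[start:start + chunk_size]
    model.partial_fit(Xc, yc, classes=np.array([0, 1]))

score = model.score(X, y)
print(score)
```

A batch learner would instead re-fit on the full accumulated dataset after every chunk, which is the redundant re-training cost the paper's incremental method avoids.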

Original language: English
Pages (from-to): 196-208
Number of pages: 13
Journal: Pattern Recognition
State: Published - Nov 2018


  • Chunk incremental learning
  • Cost-sensitive learning
  • Hinge loss
  • Support vector machines


