Structural minimax probability machine

Bin Gu, Xingming Sun, Victor S. Sheng

Research output: Contribution to journal › Article › peer-review

261 Scopus citations

Abstract

The minimax probability machine (MPM) is an interesting discriminative classifier built on generative prior knowledge. It directly estimates a probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge and has been found to be vital for designing classifiers for real-world problems. However, MPM considers only the prior probability distribution of each class with a given mean and covariance matrix, and thus does not efficiently exploit the structural information of the data. In this paper, we use two finite mixture models to capture the structural information of the data in binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of second-order cone programming (SOCP) problems. Moreover, we extend the linear SMPM model to a nonlinear one via kernelization techniques. We also show that SMPM can be interpreted as a large margin classifier and can be transformed into the support vector machine and the maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.
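To make the starting point of the abstract concrete, the sketch below solves the classical linear MPM that SMPM generalizes: given only the mean and covariance of each class, minimize ||Σ₁^{1/2} a|| + ||Σ₂^{1/2} a|| subject to aᵀ(μ₁ − μ₂) = 1, which yields the worst-case accuracy bound α = κ²/(1 + κ²) with κ = 1/(||Σ₁^{1/2} a|| + ||Σ₂^{1/2} a||). This is a hedged illustration of the baseline MPM formulation (not the paper's SMPM or its SOCP sequence); the synthetic means, covariances, and use of a general-purpose SciPy solver in place of a dedicated SOCP solver are all assumptions for demonstration.

```python
import numpy as np
from scipy.linalg import sqrtm
from scipy.optimize import minimize

# Assumed class-conditional moments (the only inputs MPM needs).
mu1, mu2 = np.array([2.0, 2.0]), np.array([-2.0, -2.0])
S1 = np.array([[1.0, 0.3], [0.3, 1.0]])
S2 = np.array([[1.0, -0.2], [-0.2, 0.5]])
S1h, S2h = np.real(sqrtm(S1)), np.real(sqrtm(S2))  # covariance square roots

d = mu1 - mu2

# Linear MPM: minimize ||S1^{1/2} a|| + ||S2^{1/2} a||  s.t.  a^T d = 1.
# (A generic nonlinear solver stands in for the SOCP solver used in practice.)
obj = lambda a: np.linalg.norm(S1h @ a) + np.linalg.norm(S2h @ a)
cons = {"type": "eq", "fun": lambda a: a @ d - 1.0}
res = minimize(obj, x0=d / (d @ d), constraints=[cons])
a = res.x

kappa = 1.0 / obj(a)                  # worst-case margin parameter
alpha = kappa**2 / (1.0 + kappa**2)   # guaranteed lower bound on accuracy
b = a @ mu1 - kappa * np.linalg.norm(S1h @ a)  # offset of the hyperplane

print(f"worst-case accuracy bound alpha = {alpha:.3f}")
# Decision rule: assign x to class 1 iff a @ x >= b.
```

SMPM replaces each single (mean, covariance) pair above with a finite mixture of such pairs per class, which is what leads to the sequence of SOCP problems described in the abstract.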

Original language: English
Article number: 7452660
Pages (from-to): 1646-1656
Number of pages: 11
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 28
Issue number: 7
DOIs
State: Published - Jul 2017

Keywords

  • Bayes learning
  • finite mixture models
  • kernel methods
  • second-order cone programming (SOCP)
  • structural learning
