TY - JOUR

T1 - Structural minimax probability machine

AU - Gu, Bin

AU - Sun, Xingming

AU - Sheng, Victor S.

N1 - Funding Information:
This work was supported in part by the National Natural Science Foundation of China under Grant 61232016, Grant 61202137, Grant U1536206, Grant U1405254, Grant 61573191, and Grant 61572259, in part by the U.S. National Science Foundation under Grant IIS-1115417, in part by the Jiangsu Provincial Key Laboratory of Big Data Analysis Techniques under Project KXK1405, and in part by the Priority Academic Program Development of Jiangsu Higher Education Institutions.
Publisher Copyright:
© 2012 IEEE.

PY - 2017/7

Y1 - 2017/7

N2 - The minimax probability machine (MPM) is a discriminative classifier built on generative prior knowledge. It directly estimates a probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge and has been found to be vital for designing classifiers in real-world problems. However, the MPM considers only the prior probability distribution of each class with a given mean and covariance matrix, which does not fully exploit the structural information of the data. In this paper, we use two finite mixture models to capture the structural information of the data in binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). The SMPM can be solved effectively by a sequence of second-order cone programming problems. Moreover, we extend the linear SMPM to a nonlinear model via kernelization. We also show that the SMPM can be interpreted as a large-margin classifier and can be transformed into the support vector machine and the maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of the SMPM.

AB - The minimax probability machine (MPM) is a discriminative classifier built on generative prior knowledge. It directly estimates a probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge and has been found to be vital for designing classifiers in real-world problems. However, the MPM considers only the prior probability distribution of each class with a given mean and covariance matrix, which does not fully exploit the structural information of the data. In this paper, we use two finite mixture models to capture the structural information of the data in binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). The SMPM can be solved effectively by a sequence of second-order cone programming problems. Moreover, we extend the linear SMPM to a nonlinear model via kernelization. We also show that the SMPM can be interpreted as a large-margin classifier and can be transformed into the support vector machine and the maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of the SMPM.

KW - Bayes learning

KW - finite mixture models

KW - kernel methods

KW - second-order cone programming (SOCP)

KW - structural learning

UR - http://www.scopus.com/inward/record.url?scp=84963962002&partnerID=8YFLogxK

U2 - 10.1109/TNNLS.2016.2544779

DO - 10.1109/TNNLS.2016.2544779

M3 - Article

C2 - 27101618

AN - SCOPUS:84963962002

VL - 28

SP - 1646

EP - 1656

JO - IEEE Transactions on Neural Networks and Learning Systems

JF - IEEE Transactions on Neural Networks and Learning Systems

SN - 2162-237X

IS - 7

M1 - 7452660

ER -