Abstract
In this paper, we propose novel second-order cone programming formulations for binary classification by extending the Minimax Probability Machine (MPM) approach. Inspired by Support Vector Machines, a regularization term is included in the MPM and Minimum Error Minimax Probability Machine (MEMPM) methods. This inclusion reduces the risk of obtaining ill-posed estimators, stabilizes the problem, and therefore improves generalization performance. Our approaches are first derived as linear methods and subsequently extended to kernel-based strategies for nonlinear classification. Experiments on well-known binary classification datasets demonstrate the virtues of the regularized formulations in terms of predictive performance.
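To make the idea concrete, the sketch below solves the classical linear MPM as a second-order cone program and adds an illustrative ℓ2 penalty on the weight vector, in the spirit of the regularization described above. This is a minimal example assuming the cvxpy and numpy libraries; the function name, the particular regularizer, and the bias computation follow the standard MPM derivation and are not the paper's exact regularized MPM/MEMPM formulations or their kernel extensions.

```python
# Minimal sketch: linear MPM as an SOCP with an illustrative l2 regularizer.
# Assumptions: cvxpy/numpy are available; the regularizer choice is illustrative,
# not the exact formulation proposed in the paper.
import numpy as np
import cvxpy as cp

def regularized_mpm(X_pos, X_neg, reg=0.1):
    """Fit a linear minimax probability machine with an added l2 penalty (sketch)."""
    mu_p, mu_n = X_pos.mean(axis=0), X_neg.mean(axis=0)
    d = X_pos.shape[1]
    # Cholesky factors act as matrix square roots of the class covariances;
    # the small ridge keeps the factorization well defined.
    Lp = np.linalg.cholesky(np.cov(X_pos, rowvar=False) + 1e-8 * np.eye(d))
    Ln = np.linalg.cholesky(np.cov(X_neg, rowvar=False) + 1e-8 * np.eye(d))

    w = cp.Variable(d)
    sigma_p = cp.norm(w @ Lp)  # = sqrt(w' Sigma_pos w), a second-order cone term
    sigma_n = cp.norm(w @ Ln)  # = sqrt(w' Sigma_neg w)
    # Classical MPM objective plus an illustrative regularization term on w.
    objective = cp.Minimize(sigma_p + sigma_n + reg * cp.norm(w))
    constraints = [w @ (mu_p - mu_n) == 1]
    cp.Problem(objective, constraints).solve()

    w_opt = w.value
    sp = np.linalg.norm(w_opt @ Lp)
    sn = np.linalg.norm(w_opt @ Ln)
    kappa = 1.0 / (sp + sn)          # worst-case margin parameter of the MPM
    b = mu_p @ w_opt - kappa * sp    # bias from the standard MPM derivation
    return w_opt, b

# Usage: predict the positive class when w_opt @ x >= b.
```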
Original language | English |
---|---|
Pages (from-to) | 127-135 |
Number of pages | 9 |
Journal | Knowledge-Based Systems |
Volume | 177 |
State | Published - 1 Aug 2019 |
Bibliographical note
Funding Information:
The first author was supported by FONDECYT, Chile project 1160738, while the second and third authors were supported by FONDECYT, Chile project 1160894. This research was partially funded by the Complex Engineering Systems Institute, ISCI, Chile (ICM-FIC: P05-004-F, CONICYT: FB0816). The authors would like to thank the anonymous reviewers for their valuable comments and suggestions.
Publisher Copyright:
© 2019 Elsevier B.V.
Keywords
- Minimax probability machine
- Regularization
- Second-order cone programming
- Support vector machines