Bayes-optimal minimax probability machines

Sebastián Maldonado*, Julio López, Miguel Carrasco, Paul Bosch

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The Minimax Probability Machine (MPM) model, a robust machine learning method, has been proposed to address the stochastic nature of data distributions by minimizing the upper bound on classification error probabilities. However, the worst-case scenario assumption in traditional MPM models may not be ideal for datasets that more closely follow a multivariate Gaussian distribution. This paper seeks to bridge this gap by introducing Gaussian adaptations to two MPM variants: the Robust Maximum Margin Classifier (RMMC) and the Cobb–Douglas Learning Machine (CD-LeMa), resulting in three novel methodologies that function as optimal Bayes classifiers. An extensive empirical comparison of various MPM variants, considering both robust and Gaussian cases, provides valuable insights into their predictive capabilities. The proposed Gaussian versions of the MPM models achieve the best performance in 71% of cases, with the RMMC variant emerging as the top approach (an average rank of 2.32 among 10 classifiers, with 1 being the top-ranked method on a given dataset). The Gaussian variants excel primarily in medium-sized datasets (1000 samples or more) and, for the MEMPM, RMMC, and CD-LeMa methods, perform better than their robust counterparts in 90% of cases.
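For context, the contrast drawn in the abstract between a worst-case error bound and a Gaussian (Bayes-optimal) one can be made concrete with the classical MPM formulation of Lanckriet et al.; the sketch below is standard background only and is not taken from this article, so the exact formulations proposed here for RMMC, CD-LeMa, and MEMPM may differ. Given class means and covariances $(\mu_1,\Sigma_1)$ and $(\mu_2,\Sigma_2)$, the MPM seeks a hyperplane $w^\top x = b$ maximizing the worst-case probability $\alpha$ of correct classification:

\[
\max_{\alpha,\, w \neq 0,\, b} \ \alpha
\quad \text{s.t.} \quad
\inf_{x \sim (\mu_1,\Sigma_1)} \Pr\!\left(w^\top x \ge b\right) \ge \alpha,
\qquad
\inf_{x \sim (\mu_2,\Sigma_2)} \Pr\!\left(w^\top x \le b\right) \ge \alpha .
\]

By the multivariate Chebyshev bound, each constraint becomes $w^\top \mu_1 - b \ge \kappa(\alpha)\sqrt{w^\top \Sigma_1 w}$ (and symmetrically for the second class) with $\kappa(\alpha)=\sqrt{\alpha/(1-\alpha)}$, which reduces to the second-order cone program

\[
\min_{w} \ \left\|\Sigma_1^{1/2} w\right\| + \left\|\Sigma_2^{1/2} w\right\|
\quad \text{s.t.} \quad
w^\top(\mu_1-\mu_2) = 1 .
\]

If the classes are instead assumed to be multivariate Gaussian, the same constraints hold exactly with $\kappa(\alpha)=\Phi^{-1}(\alpha)$, the standard normal quantile, which is the kind of distributional assumption under which such rules can be tied to the Bayes classifier. Purely as an illustrative sketch of this classical worst-case MPM (again, not the article's proposed variants), the SOCP above can be solved with an off-the-shelf conic solver; the helper name fit_mpm below is hypothetical:

```python
# Minimal sketch: classical worst-case MPM trained as an SOCP (Lanckriet et al.).
# Illustrates the standard formulation only; it is not this article's method.
import numpy as np
import cvxpy as cp
from scipy.linalg import sqrtm

def fit_mpm(X_pos, X_neg):
    """Return (w, b) for the decision rule sign(w @ x - b)."""
    mu1, mu2 = X_pos.mean(axis=0), X_neg.mean(axis=0)
    S1_half = np.real(sqrtm(np.cov(X_pos, rowvar=False)))
    S2_half = np.real(sqrtm(np.cov(X_neg, rowvar=False)))

    w = cp.Variable(X_pos.shape[1])
    # min ||S1^{1/2} w|| + ||S2^{1/2} w||  s.t.  w^T (mu1 - mu2) = 1
    prob = cp.Problem(cp.Minimize(cp.norm(S1_half @ w) + cp.norm(S2_half @ w)),
                      [(mu1 - mu2) @ w == 1])
    prob.solve()

    w_val = w.value
    # kappa is the reciprocal of the optimal objective; the worst-case
    # correct-classification bound is kappa^2 / (1 + kappa^2).
    kappa = 1.0 / prob.value
    b = float(w_val @ mu1) - kappa * np.linalg.norm(S1_half @ w_val)
    return w_val, b
```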

Original language: English
Article number: 112068
Journal: Pattern Recognition
Volume: 171
DOIs
State: Published - Mar 2026

Bibliographical note

Publisher Copyright:
© 2025 Elsevier Ltd

Keywords

  • Gaussian distribution
  • Minimax probability machine
  • Minimum error minimax probability machine
  • Second-order cone programming
  • Support vector machines

