The Cobb-Douglas Learning Machine

Sebastián Maldonado*, Julio López, Miguel Carrasco

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

In this paper, we propose a novel machine learning approach based on robust optimization. Our proposal casts the task of maximizing the two class accuracies of a binary classification problem as a Cobb-Douglas function. This function is well known in production economics, where it models the relationship between two or more inputs and the quantity of output they produce. A robust optimization problem is defined to construct the decision function. The goal of the model is to classify each training pattern correctly, up to a given class accuracy, even under the worst possible data distribution. We demonstrate the theoretical advantages of the Cobb-Douglas function in terms of the properties of the resulting second-order cone programming problem. Important extensions are proposed and discussed, including the use of kernel functions and regularization. Experiments on several classification datasets confirm these advantages, with the proposed method achieving the best average performance among various alternative classifiers.
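The abstract describes the method only at a high level. As a rough illustration of the ingredients it names (worst-case per-class accuracies enforced through second-order cone constraints, aggregated by a Cobb-Douglas function), the sketch below uses cvxpy. The Chebyshev-based constraint and the factor kappa(eta) = sqrt(eta/(1-eta)) come from the minimax probability machine literature referenced in the keywords; the grid search, the exponent alpha, the normalization constraint, and the synthetic data are illustrative assumptions, not the paper's algorithm.

```python
# A minimal sketch, NOT the paper's formulation: it assumes the robust per-class
# accuracy requirement reduces, via the multivariate Chebyshev bound, to the
# second-order cone constraints used in minimax probability machines,
#   w'mu_+ - b >= kappa(eta_+) * ||S_+^{1/2} w||,  kappa(eta) = sqrt(eta/(1-eta)),
#   b - w'mu_- >= kappa(eta_-) * ||S_-^{1/2} w||,
# and it scores a pair of class accuracies with the Cobb-Douglas product
# eta_+^alpha * eta_-^(1-alpha). The grid search is purely for illustration.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
Xp = rng.normal(loc=+1.0, scale=1.0, size=(60, 2))   # synthetic positive class
Xn = rng.normal(loc=-1.0, scale=1.0, size=(60, 2))   # synthetic negative class

mu_p, mu_n = Xp.mean(axis=0), Xn.mean(axis=0)
Sp = np.cov(Xp, rowvar=False) + 1e-6 * np.eye(2)     # lightly regularized covariances
Sn = np.cov(Xn, rowvar=False) + 1e-6 * np.eye(2)
Lp, Ln = np.linalg.cholesky(Sp), np.linalg.cholesky(Sn)   # S = L L', so w'Sw = ||L'w||^2

def feasible(eta_p, eta_n):
    """Check whether some hyperplane w'x = b meets both worst-case accuracy targets."""
    kp = np.sqrt(eta_p / (1.0 - eta_p))
    kn = np.sqrt(eta_n / (1.0 - eta_n))
    w, b = cp.Variable(2), cp.Variable()
    cons = [
        mu_p @ w - b >= kp * cp.norm(Lp.T @ w, 2),   # worst-case accuracy, positive class
        b - mu_n @ w >= kn * cp.norm(Ln.T @ w, 2),   # worst-case accuracy, negative class
        (mu_p - mu_n) @ w == 1,                      # normalization; rules out w = 0
    ]
    prob = cp.Problem(cp.Minimize(0), cons)          # pure feasibility problem
    prob.solve()
    return prob.status in ("optimal", "optimal_inaccurate")

alpha = 0.5                                          # hypothetical Cobb-Douglas exponent
grid = np.linspace(0.50, 0.95, 10)
best = max(
    ((e1 ** alpha) * (e2 ** (1 - alpha)), e1, e2)
    for e1 in grid for e2 in grid if feasible(e1, e2)
)
print("best Cobb-Douglas value %.3f at eta_+=%.2f, eta_-=%.2f" % best)
```

The grid search stands in for whatever optimization scheme the paper actually uses over the accuracy pair; the point of the sketch is only that each candidate pair (eta_+, eta_-) leads to a second-order cone feasibility problem, and the Cobb-Douglas product gives a single score that trades the two class accuracies off against each other.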

Original language: English
Article number: 108701
Journal: Pattern Recognition
Volume: 128
DOIs
State: Published - Aug 2022

Bibliographical note

Funding Information:
Financial support from ANID Chile (FONDECYT grants 1200221 and 1201403, PIA/BASAL AFB180003, and STIC-AMSUD 22-STIC-09) is gratefully acknowledged by the authors.
Publisher Copyright:
© 2022 Elsevier Ltd

Keywords

  • Cobb-Douglas
  • Minimax Probability Machine
  • Minimum Error Minimax Probability Machine
  • Second-order Cone Programming
  • Support Vector Machines

