TY - JOUR
T1 - The Cobb-Douglas Learning Machine
AU - Maldonado, Sebastián
AU - López, Julio
AU - Carrasco, Miguel
N1 - Funding Information:
The authors gratefully acknowledge financial support from ANID Chile: FONDECYT grants 1200221 and 1201403, PIA/BASAL AFB180003, and STIC-AMSUD 22-STIC-09.
Publisher Copyright:
© 2022 Elsevier Ltd
PY - 2022/8
Y1 - 2022/8
N2 - In this paper, we propose a novel machine learning approach based on robust optimization. Our proposal defines the task of maximizing the two per-class accuracies of a binary classification problem as a Cobb-Douglas function. This function is well known in production economics, where it models the relationship between two or more inputs and the quantity of output they produce. A robust optimization problem is defined to construct the decision function. The goal of the model is to classify each training pattern correctly, up to a given class accuracy, even under the worst possible data distribution. We demonstrate the theoretical advantages of the Cobb-Douglas function in terms of the properties of the resulting second-order cone programming problem. Important extensions, including the use of kernel functions and regularization, are proposed and discussed. Experiments performed on several classification datasets confirm these advantages, with the proposed method achieving the best average performance among various alternative classifiers.
AB - In this paper, we propose a novel machine learning approach based on robust optimization. Our proposal defines the task of maximizing the two per-class accuracies of a binary classification problem as a Cobb-Douglas function. This function is well known in production economics, where it models the relationship between two or more inputs and the quantity of output they produce. A robust optimization problem is defined to construct the decision function. The goal of the model is to classify each training pattern correctly, up to a given class accuracy, even under the worst possible data distribution. We demonstrate the theoretical advantages of the Cobb-Douglas function in terms of the properties of the resulting second-order cone programming problem. Important extensions, including the use of kernel functions and regularization, are proposed and discussed. Experiments performed on several classification datasets confirm these advantages, with the proposed method achieving the best average performance among various alternative classifiers.
KW - Cobb-Douglas
KW - Minimax Probability Machine
KW - Minimum Error Minimax Probability Machine
KW - Second-Order Cone Programming
KW - Support Vector Machines
UR - http://www.scopus.com/inward/record.url?scp=85128401200&partnerID=8YFLogxK
U2 - 10.1016/j.patcog.2022.108701
DO - 10.1016/j.patcog.2022.108701
M3 - Article
AN - SCOPUS:85128401200
SN - 0031-3203
VL - 128
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 108701
ER -