Double regularization methods for robust feature selection and SVM classification via DC programming

Julio López, Sebastián Maldonado, Miguel Carrasco

Research output: Contribution to journal › Article › peer-review


Abstract

In this work, two novel formulations for embedded feature selection are presented. A second-order cone programming approach for Support Vector Machines is extended by adding a second regularizer that encourages feature elimination. The one-norm and zero-norm penalties are combined with Tikhonov regularization in a robust setting designed to classify instances correctly, up to a predefined error rate, even under the worst-case data distribution. The use of the zero norm leads to a nonconvex formulation, which is solved via Difference of Convex (DC) functions, extending DC programming to second-order cones. Experiments on high-dimensional microarray datasets show that our approaches achieve the best performance compared with well-known feature selection methods for Support Vector Machines.
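To make the double regularization idea concrete, the following is a minimal sketch of one possible formulation, written in the standard robust (chance-constrained) SOC-SVM notation; the class means \(\boldsymbol{\mu}_{\pm}\), covariances \(\Sigma_{\pm}\), robustness factors \(\kappa_{\pm}=\sqrt{\eta_{\pm}/(1-\eta_{\pm})}\) for worst-case error rates \(\eta_{\pm}\), trade-off parameter \(C\), and the concave exponential surrogate for the zero norm are illustrative assumptions, not necessarily the exact choices made in the paper.

\[
\min_{\mathbf{w},\,b}\;\tfrac{1}{2}\|\mathbf{w}\|_2^2 + C\,\rho(\mathbf{w})
\quad\text{s.t.}\quad
\mathbf{w}^{\top}\boldsymbol{\mu}_{+} + b \ge 1 + \kappa_{+}\bigl\|\Sigma_{+}^{1/2}\mathbf{w}\bigr\|_2,
\qquad
-\bigl(\mathbf{w}^{\top}\boldsymbol{\mu}_{-} + b\bigr) \ge 1 + \kappa_{-}\bigl\|\Sigma_{-}^{1/2}\mathbf{w}\bigr\|_2,
\]

where the second regularizer \(\rho(\mathbf{w})\) is either \(\|\mathbf{w}\|_1\) (convex, so the whole problem remains a second-order cone program) or a smooth surrogate of the zero norm, e.g.

\[
\|\mathbf{w}\|_0 \approx \sum_{j=1}^{n}\bigl(1 - e^{-\alpha |w_j|}\bigr)
= \underbrace{\alpha\|\mathbf{w}\|_1}_{g(\mathbf{w})\ \text{convex}}
\;-\;
\underbrace{\sum_{j=1}^{n}\bigl(\alpha|w_j| - 1 + e^{-\alpha |w_j|}\bigr)}_{h(\mathbf{w})\ \text{convex}},
\]

so the nonconvex penalty admits a DC decomposition \(g - h\). A DC algorithm then iteratively linearizes \(h\) at the current iterate and solves the resulting convex subproblem, which under this sketch is again a second-order cone program.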
Original language: American English
Pages (from-to): 377-389
Number of pages: 13
Journal: Information Sciences
Volume: 429
DOIs
State: Published - 1 Mar 2018

Keywords

  • DC algorithm
  • Second-order cone programming
  • Support vector machines
  • Zero norm

