Abstract
Kernel methods are crucial in machine learning due to their ability to model nonlinear relationships in data. Among these, the Support Vector Machine (SVM) is widely recognized for its robust performance and appealing optimization properties. In this work, we build upon recent advancements in SVM variants to propose five novel models specifically designed for multiclass learning. In particular, we introduce One-vs-One and One-vs-All versions of the nonparallel hyperplane SVM and the improved twin SVM, along with a unified all-together optimization variant of the former method for nonlinear multiclass classification. Our empirical evaluation, conducted on 11 datasets with 12 multiclass classifiers, shows the superiority of our methods: four of the five proposed models rank among the top performers and consistently outperform alternative approaches in terms of balanced accuracy. A statistical test further confirms significant differences among the classifiers.
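For illustration, the following minimal sketch shows the two decomposition strategies mentioned above (One-vs-One and One-vs-All) applied to a nonlinear kernel SVM and evaluated with balanced accuracy. It uses a standard RBF-kernel `SVC` from scikit-learn on the iris dataset as a stand-in; the paper's proposed nonparallel hyperplane SVM and improved twin SVM formulations are not implemented here.

```python
# Sketch only: a standard RBF-kernel SVC stands in for the paper's proposed
# nonparallel hyperplane SVM / improved twin SVM models.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC
from sklearn.metrics import balanced_accuracy_score

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Nonlinear binary SVM used as the base learner for both decompositions.
base = SVC(kernel="rbf", C=1.0, gamma="scale")

for name, clf in [
    ("One-vs-One", OneVsOneClassifier(base)),   # one binary SVM per class pair
    ("One-vs-All", OneVsRestClassifier(base)),  # one binary SVM per class
]:
    clf.fit(X_tr, y_tr)
    bal_acc = balanced_accuracy_score(y_te, clf.predict(X_te))
    print(f"{name}: balanced accuracy = {bal_acc:.3f}")
```

Balanced accuracy is reported here because it is the metric used in the paper's comparison; an all-together variant, by contrast, would solve a single joint optimization problem rather than combining binary subproblems.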
| Original language | English |
| --- | --- |
| Article number | 053134 |
| Journal | Chaos |
| Volume | 35 |
| Issue number | 5 |
| DOIs | |
| State | Published - 1 May 2025 |
Bibliographical note
Publisher Copyright: © 2025 Author(s).