In this work, a novel method called epsilon-nonparallel support vector regression (ε-NPSVR) is proposed. The reasoning behind the nonparallel support vector machine (NPSVM) method for binary classification is extended to the prediction of numerical outputs. Our proposal constructs two nonparallel hyperplanes such that each one is closer to one of the two training patterns and as far as possible from the other. Two epsilon-insensitive tubes, obtained by shifting the regression function up and down by two fixed parameters, are also built to better align each hyperplane with its respective training pattern. Our proposal shares the methodological advantages of NPSVM: a kernel-based formulation can be derived directly by applying duality theory; each twin problem has the same structure as the SVR problem, allowing the use of efficient optimization algorithms for fast training; it provides a generalized formulation for twin support vector regression (TSVR); and it achieves better predictive performance than the original TSVR. This last advantage is confirmed by our experiments on well-known benchmark datasets for regression.
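The ε-insensitive tube at the core of the formulation can be illustrated with a minimal, self-contained sketch: subgradient descent on the standard linear ε-SVR objective, showing the regression function shifted up and down by ε. This is an illustrative toy example with hypothetical data and parameter values, not the authors' ε-NPSVR solver (which builds two such nonparallel functions):

```python
import numpy as np

def eps_insensitive_loss(residual, eps):
    # zero inside the epsilon-tube, linear (|r| - eps) outside it
    return np.maximum(np.abs(residual) - eps, 0.0)

# synthetic 1-D data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 50)
y = 2.0 * X + 0.5 + rng.normal(0.0, 0.05, 50)

# subgradient descent on the primal objective:
#   0.5 * w^2 + C * mean( eps-insensitive loss of y - (w*x + b) )
w, b = 0.0, 0.0
C, eps, lr = 10.0, 0.1, 0.01
for _ in range(2000):
    r = y - (w * X + b)
    # subgradient of the loss w.r.t. the prediction f(x) = w*x + b
    g = np.where(r > eps, -1.0, np.where(r < -eps, 1.0, 0.0))
    w -= lr * (w + C * np.mean(g * X))   # regularizer + loss term
    b -= lr * (C * np.mean(g))           # bias is unregularized

# the tube boundaries: the fitted function shifted up and down by eps
upper = w * X + b + eps
lower = w * X + b - eps
inside = np.mean((y <= upper) & (y >= lower))  # fraction of points in the tube
print(w, b, inside)
```

Points inside the tube contribute no loss, so the fit is driven only by the points on or outside the shifted boundaries; ε-NPSVR applies this shifting idea separately to each of its two nonparallel functions.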
Bibliographical note
Funding Information:
This research was partially funded by CONICYT, FONDECYT projects 1160894 and 1160738, and by the Complex Engineering Systems Institute (CONICYT, PIA, FB0816).
© 2019, Springer Science+Business Media, LLC, part of Springer Nature.
Keywords:
- Nonparallel support vector machines
- Support vector regression
- Twin support vector regression