Efficient implicit Lagrangian twin parametric insensitive support vector regression via unconstrained minimization problems



References (45)

Publisher
Springer Journals
Copyright
Copyright © Springer Nature Switzerland AG 2020
ISSN
1012-2443
eISSN
1573-7470
DOI
10.1007/s10472-020-09708-0

Abstract

In this paper, an efficient implicit Lagrangian twin parametric insensitive support vector regression is proposed that leads to a pair of unconstrained minimization problems, motivated by the work on twin parametric insensitive support vector regression (Peng: Neurocomputing 79, 26–38, 2012) and Lagrangian twin support vector regression (Balasundaram and Tanveer: Neural Comput. Applic. 22(1), 257–267, 2013). Since each objective function is strongly convex, piecewise quadratic, and differentiable, it can be solved by gradient-based iterative methods. Because the objective function contains the non-smooth ‘plus’ function, one can either use the generalized Hessian or replace the ‘plus’ function with a smooth approximation, and then apply a simple Newton algorithm with Armijo step size. These algorithms can be easily implemented in MATLAB and do not require any optimization toolbox. The advantages of the proposed method are that its algorithms take less training time and can handle data with a heteroscedastic noise structure. To demonstrate the effectiveness of the proposed method, computational results are reported on synthetic and real-world datasets; they show comparable generalization performance and improved learning speed relative to support vector regression, twin support vector regression, and twin parametric insensitive support vector regression.
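The abstract describes smoothing the non-smooth ‘plus’ function and then applying a Newton method with Armijo step size. The following is a minimal Python sketch of that general idea, not the paper's exact formulation: the toy objective, the matrices `A`, `b`, the parameter `C`, and the smoothing constant `alpha` are all illustrative placeholders, and the smooth surrogate used here is the standard one p(x, α) = x + (1/α) log(1 + exp(−αx)), which tends to max(0, x) as α grows.

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    """Smooth approximation of the plus function max(0, x)."""
    # np.logaddexp(0, -alpha*x) computes log(1 + exp(-alpha*x)) stably.
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def sigmoid(x, alpha=5.0):
    """Derivative of smooth_plus with respect to x."""
    return 1.0 / (1.0 + np.exp(-alpha * x))

def objective(u, A, b, C, alpha=5.0):
    """Illustrative strongly convex smoothed model:
    0.5*||u||^2 + (C/2)*||smooth_plus(A u - b)||^2."""
    r = smooth_plus(A @ u - b, alpha)
    return 0.5 * u @ u + 0.5 * C * (r @ r)

def newton_armijo(A, b, C, alpha=5.0, tol=1e-6, max_iter=200):
    """Newton-type iteration with Armijo backtracking line search."""
    m, n = A.shape
    u = np.zeros(n)
    for _ in range(max_iter):
        z = A @ u - b
        p = smooth_plus(z, alpha)
        s = sigmoid(z, alpha)
        grad = u + C * A.T @ (p * s)
        if np.linalg.norm(grad) < tol:
            break
        # Positive-definite Hessian surrogate (a generalized-Hessian-style
        # stand-in): I + C * A^T diag(s^2) A.
        H = np.eye(n) + C * A.T @ (s[:, None] ** 2 * A)
        d = np.linalg.solve(H, -grad)
        # Armijo backtracking: shrink the step until sufficient decrease.
        t, f0 = 1.0, objective(u, A, b, C, alpha)
        while objective(u + t * d, A, b, C, alpha) > f0 + 1e-4 * t * (grad @ d):
            t *= 0.5
            if t < 1e-10:
                break
        u = u + t * d
    return u
```

Because the Hessian surrogate is positive definite by construction, each Newton direction is a descent direction and the Armijo backtracking loop always terminates, which is what lets such schemes run without any optimization toolbox.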

Journal

Annals of Mathematics and Artificial Intelligence, Springer Journals

Published: Nov 19, 2020
