"Imbalanced data classification using second-order cone programming support vector machines"

Year:
2014
Type of Publication:
Article
Keywords:
Class imbalanced data, Support Vector Machines, LP SVM, SOCP SVM
Authors:
  • S. Maldonado
  • J. López
Journal:
Pattern Recognition
Volume:
47
Number:
5
Pages:
2070–2079
Month:
May
ISSN:
0031-3203
Abstract:
Learning from imbalanced data sets is an important machine learning challenge, especially for Support Vector Machines (SVM), which assume equal misclassification costs and treat each object independently. Second-order cone programming SVM (SOCP-SVM) instead models each class separately, making it a natural formulation for the imbalanced classification task. This work presents a novel second-order cone programming (SOCP) formulation based on the LP-SVM principle: the bound on the VC dimension is suitably loosened using the l∞-norm, and the margin is maximized directly through two margin variables, one associated with each class. A regularization parameter C controls the trade-off between the maximization of these two margin variables. The proposed method has two advantages: it achieves better predictive performance, since it is designed specifically for imbalanced classification, and it reduces computational complexity, since one conic constraint is eliminated. Experiments on benchmark imbalanced data sets demonstrate that our approach achieves the best classification performance compared with the traditional SOCP-SVM formulation and with cost-sensitive formulations for linear SVM.
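To make the comparison concrete, the snippet below sketches one of the baselines the abstract mentions: a cost-sensitive linear SVM, trained by subgradient descent on a per-class weighted hinge loss. This is an illustrative sketch of the baseline family, not the paper's SOCP formulation; the toy data, cost heuristic, and hyperparameters are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy imbalanced data: 200 majority (-1) vs 20 minority (+1) points
# (illustrative only, not from the paper's benchmarks).
X_neg = rng.normal(-2.0, 1.0, size=(200, 2))
X_pos = rng.normal(+2.0, 1.0, size=(20, 2))
X = np.vstack([X_neg, X_pos])
y = np.concatenate([-np.ones(200), np.ones(20)])

# Per-class misclassification costs: weight each class inversely to its
# frequency, a common cost-sensitive heuristic (an assumption here).
n = len(y)
cost = np.where(y == 1, n / (2 * 20), n / (2 * 200))

def train_cost_sensitive_svm(X, y, cost, lam=0.01, lr=0.1, epochs=200):
    """Subgradient descent on the cost-sensitive hinge loss
    lam/2 * ||w||^2 + (1/n) * sum_i cost_i * max(0, 1 - y_i (w.x_i + b))."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        active = margins < 1  # points violating the margin contribute a subgradient
        grad_w = lam * w - (cost[active] * y[active]) @ X[active] / n
        grad_b = -np.sum(cost[active] * y[active]) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

w, b = train_cost_sensitive_svm(X, y, cost)
pred = np.sign(X @ w + b)
```

Up-weighting minority-class errors shifts the separating hyperplane toward the majority class, improving minority recall; this is the trade-off the paper's SOCP formulation addresses instead through per-class margin variables balanced by C.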