Learning the Kernel Matrix in Discriminant Analysis via Quadratically Constrained Quadratic Programming (Conference Paper)

abstract

  • The kernel function plays a central role in kernel methods. In this paper, we consider the automated learning of the kernel matrix over a convex combination of pre-specified kernel matrices in Regularized Kernel Discriminant Analysis (RKDA), which performs linear discriminant analysis in the feature space via the kernel trick. Previous studies have shown that this kernel learning problem can be formulated as a semidefinite program (SDP), which is, however, computationally expensive even with recent advances in interior point methods. Based on the equivalence between RKDA and least-squares problems in the binary-class case, we propose a Quadratically Constrained Quadratic Programming (QCQP) formulation for the kernel learning problem, which can be solved more efficiently than the SDP. While most existing work on kernel learning deals with binary-class problems only, we show that our QCQP formulation extends naturally to the multi-class case. Experimental results on both binary-class and multi-class benchmark data sets show the efficacy of the proposed QCQP formulations. © 2007 ACM.
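  For orientation, below is a minimal sketch of how learning a convex combination of kernels can be cast as a QCQP. This is illustrative only, not the paper's formulation: it drops the RKDA-specific least-squares reduction and trace-normalization constraints, and the toy data, the regularization value, and the choice of cvxpy as solver are all assumptions. Minimizing a regularized kernel objective over simplex weights and swapping min and max yields one quadratic constraint per candidate kernel, with the learned weights recovered from the constraint duals.

```python
import numpy as np
import cvxpy as cp

# Toy binary-class data (hypothetical; stands in for a benchmark data set).
rng = np.random.default_rng(0)
n = 40
X = rng.standard_normal((n, 5))
y = np.sign(rng.standard_normal(n))  # labels in {-1, +1}

def rbf_kernel(X, gamma):
    """Gaussian (RBF) kernel matrix; widths below are chosen ad hoc."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Pre-specified candidate kernel matrices.
kernels = [rbf_kernel(X, g) for g in (0.1, 1.0, 10.0)]
# Cholesky factors so each quadratic constraint can be written as a sum of squares.
factors = [np.linalg.cholesky(K + 1e-8 * np.eye(n)) for K in kernels]

lam = 1.0  # regularization parameter (assumed value)

# QCQP: max_{a, t}  2 y^T a - lam * a^T a - t
#       s.t.        a^T K_i a <= t   for every candidate kernel K_i.
# The epigraph variable t encodes the worst case over the kernel simplex,
# since a linear function of the weights attains its extreme at a vertex.
a = cp.Variable(n)
t = cp.Variable()
constraints = [cp.sum_squares(L.T @ a) <= t for L in factors]
objective = cp.Maximize(2 * (y @ a) - lam * cp.sum_squares(a) - t)
prob = cp.Problem(objective, constraints)
prob.solve()

# The convex-combination weights are the duals of the quadratic constraints
# (they sum to one at optimality; normalization guards against round-off).
theta = np.array([float(c.dual_value) for c in constraints])
theta /= theta.sum()
print("learned kernel weights:", theta)
```

  In this sketch the per-kernel constraints replace an explicit weight variable; reading the weights off the constraint duals is what makes the problem a QCQP rather than an SDP, which is the source of the efficiency gain the abstract describes.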

name of conference

  • Proceedings of the 13th ACM SIGKDD international conference on Knowledge discovery and data mining

published proceedings

  • KDD-2007: Proceedings of the Thirteenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining

author list (cited authors)

  • Ye, J., Ji, S., & Chen, J.

citation count

  • 13

complete list of authors

  • Ye, Jieping||Ji, Shuiwang||Chen, Jianhui

publication date

  • August 2007