Discriminant kernel and regularization parameter learning via semidefinite programming Conference Paper


  • Regularized Kernel Discriminant Analysis (RKDA) performs linear discriminant analysis in the feature space via the kernel trick. The performance of RKDA depends on the selection of kernels. In this paper, we consider the problem of learning an optimal kernel over a convex set of kernels. We show that the kernel learning problem can be formulated as a semidefinite program (SDP) in the binary-class case. We further extend the SDP formulation to the multi-class case. This extension is based on a key result established in this paper: the multi-class kernel learning problem can be decomposed into a set of binary-class kernel learning problems. In addition, we propose an approximation scheme to reduce the computational complexity of the multi-class SDP formulation. The performance of RKDA also depends on the value of the regularization parameter. We show that this value can be learned automatically within the same framework. Experimental results on benchmark data sets demonstrate the efficacy of the proposed SDP formulations.
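  • The abstract describes learning a kernel from a convex set of candidates. A minimal sketch of that idea follows; it is not the paper's SDP formulation, but an illustrative grid search over convex combinations of two candidate Gram matrices, each scored by a regularized discriminant-style criterion a^T (K + λI)^{-1} a (the scoring function and kernel parameters here are assumptions for illustration only):

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gram matrix of the Gaussian RBF kernel exp(-gamma * ||xi - xj||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def rkda_score(K, y, lam):
    # Illustrative stand-in for the paper's SDP objective:
    # a^T (K + lam*I)^{-1} a, where a is the centered class-indicator
    # vector for the binary-class case.
    n = len(y)
    a = np.where(y == 1, 1.0, -1.0)
    a -= a.mean()
    return float(a @ np.linalg.solve(K + lam * np.eye(n), a))

# Toy data: two candidate kernels, search convex weights t in [0, 1]
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = (X[:, 0] > 0).astype(int)
K1, K2 = rbf_kernel(X, 0.1), rbf_kernel(X, 10.0)
best = max((rkda_score(t * K1 + (1 - t) * K2, y, lam=1e-2), t)
           for t in np.linspace(0.0, 1.0, 21))
print(best)  # (best score, best mixing weight t)
```

  In the paper itself, this search over the convex set of kernels is replaced by a semidefinite program, which also handles the multi-class decomposition and the regularization parameter jointly.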

name of conference

  • Proceedings of the 24th International Conference on Machine Learning (ICML 2007)

published proceedings

  • Proceedings of the 24th International Conference on Machine Learning (ICML 2007)

author list (cited authors)

  • Ye, J., Chen, J., & Ji, S.

citation count

  • 14

complete list of authors

  • Ye, Jieping||Chen, Jianhui||Ji, Shuiwang

publication date

  • January 2007