A New Algorithm for Fast Discriminative Training
Currently, most discriminative training algorithms for nonlinear classifier design are based on gradient-descent (GD) methods for cost minimization. These algorithms are easy to derive and effective in practice, but slow to train. To address this problem, we propose a fast discriminative training algorithm that draws on insights from the expectation-maximization (EM) algorithm. The EM algorithm is known to be efficient for maximum likelihood estimation, particularly for distributions involving general Gaussian mixtures; it is, however, not readily applicable to discriminative training. The contribution of this report is a new formulation of the cost minimization process, through which we propose a solution procedure that can be implemented efficiently while delivering the results promised by discriminative training.
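For context, the standard (generative) EM procedure referenced above can be sketched for a one-dimensional Gaussian mixture. This is the classical maximum-likelihood baseline, not the new discriminative algorithm of this report; the function name `em_gmm_1d` and all parameter choices are illustrative assumptions.

```python
import math
import random

def em_gmm_1d(data, k=2, iters=50, seed=0):
    """Classical EM for a 1-D Gaussian mixture (illustration only).

    Returns the mixture weights, means, and variances after `iters`
    alternations of the E-step and the closed-form M-step.
    """
    rng = random.Random(seed)
    n = len(data)
    # Initialise means from random data points, unit variances, uniform weights.
    mu = rng.sample(data, k)
    var = [1.0] * k
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[j] * math.exp(-(x - mu[j]) ** 2 / (2.0 * var[j]))
                 / math.sqrt(2.0 * math.pi * var[j]) for j in range(k)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        # M-step: closed-form weighted maximum-likelihood updates.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            w[j] = nj / n
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = sum(r[j] * (x - mu[j]) ** 2 for r, x in zip(resp, data)) / nj
            var[j] = max(var[j], 1e-6)  # guard against variance collapse
    return w, mu, var
```

Each M-step here is a closed-form update rather than a gradient step, which is the source of EM's speed advantage that the proposed algorithm seeks to carry over to the discriminative setting.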