Compared with batch learning algorithms, online prediction algorithms hold a significant advantage on large sample sets owing to their high predictive accuracy and low time and space complexity. The online learning framework proposed by Warmuth and Kivinen, which balances conservativeness against correctness, has been widely applied; however, in their exponentiated gradient and gradient descent algorithms, the approximation step used when differentiating the loss function in the objective function degrades the online learning results. Based on dual optimization theory, this paper proposes a variable-update classification algorithm built on different loss functions and distance functions. A series of experiments and analyses shows that the algorithm greatly improves prediction accuracy, verifying its correctness, feasibility, and effectiveness.
Abstract: In comparison with batch learning algorithms, online prediction algorithms offer high predictive accuracy and low time and space complexity on large sample sets, and therefore hold a significant advantage. The online learning framework proposed by Warmuth and Kivinen, which balances conservativeness against correctness, has been widely applied. However, in Warmuth and Kivinen's exponentiated gradient and gradient descent algorithms, the approximation step used when differentiating the loss function in the objective function degrades the results of online learning. Based on dual optimization theory, this paper proposes a variable-update classification algorithm built on a variety of loss functions and distance functions. A series of experiments and analyses shows that the algorithm improves prediction accuracy, verifying its correctness, feasibility, and effectiveness.