The kernel function is a key factor affecting the learning performance of support vector machines (SVMs). Based on wavelet decomposition theory and the admissibility conditions for support vector kernel functions, this paper proposes a multidimensional admissible scaling kernel function for SVMs. The kernel is not only translation-orthonormal, but its orthonormality also allows it to approximate arbitrary curves in the square-integrable space, thereby improving the generalization performance of the SVM. Building on the use of the scaling function as a support vector kernel, a least squares support vector machine based on the scaling kernel function (LS-SSVM) is proposed. Experimental results show that, under identical conditions, the LS-SSVM achieves higher learning accuracy than the conventional SVM and is therefore better suited to learning complex functions.
The kernel function of a support vector machine (SVM) is an important factor affecting its learning performance. Based on wavelet decomposition theory and the admissibility conditions for support vector kernel functions, a new scaling kernel function for SVMs (SSVM) is proposed. This kernel is not only translation-orthonormal, but can also approximate any curve in the square-integrable space by virtue of its orthonormality, thus enhancing the generalization ability of the SVM. Combining the scaling kernel function with regularization theory, a least squares support vector machine based on the scaling kernel function (LS-SSVM) is proposed to simplify the solving process of the SSVM. The LS-SSVM is then applied to regression analysis and classification. Experimental results show that the regression precision is improved compared with an LS-SVM using a Gaussian kernel function.
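The LS-SVM construction referred to above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact method: the dual of the least squares SVM reduces to one linear KKT system in the bias `b` and the coefficients `alpha`, and the kernel here is a generic product-form, translation-invariant wavelet-type kernel `h(u) = cos(1.75u) exp(-u^2/2)`; the paper's own scaling function, the regularization constant `gamma`, and the dilation `a` are assumptions chosen only for the demo.

```python
import numpy as np

def scaling_kernel(X1, X2, a=1.0):
    """Product-form translation-invariant kernel over all input dimensions.

    The 1-D mother function h(u) = cos(1.75u) * exp(-u^2/2) is an
    illustrative wavelet-type choice, not necessarily the paper's own
    scaling function.
    """
    diff = (X1[:, None, :] - X2[None, :, :]) / a        # (n1, n2, d)
    return np.prod(np.cos(1.75 * diff) * np.exp(-diff ** 2 / 2), axis=-1)

def lssvm_fit(X, y, gamma=100.0, a=1.0):
    """Solve the (n+1)x(n+1) linear LS-SVM KKT system for (b, alpha)."""
    n = len(y)
    K = scaling_kernel(X, X, a)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                                      # sum(alpha) = 0 row
    A[1:, 0] = 1.0                                      # bias column
    A[1:, 1:] = K + np.eye(n) / gamma                   # ridge-regularized kernel
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]                              # b, alpha

def lssvm_predict(X_train, b, alpha, X_new, a=1.0):
    """f(x) = sum_i alpha_i k(x, x_i) + b."""
    return scaling_kernel(X_new, X_train, a) @ alpha + b

# Tiny 1-D regression demo: learn y = sin(x) on 50 training points.
X = np.linspace(-3, 3, 50)[:, None]
y = np.sin(X).ravel()
b, alpha = lssvm_fit(X, y)
y_hat = lssvm_predict(X, b, alpha, X)
```

Because the LS-SVM replaces the SVM's inequality constraints with equality constraints, training amounts to a single dense linear solve rather than a quadratic program, which is the "simplified solving process" the abstract refers to.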