This paper surveys recent developments in learning theory on the stability and generalization of learning algorithms. The similarities and differences among the main existing stability frameworks, such as hypothesis stability, pointwise hypothesis stability, uniform stability, almost everywhere stability, and CVEEEloo stability, are compared, and a number of open questions on the stability and generalization of learning algorithms that remain to be resolved are pointed out.
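As a point of reference, a minimal recap of one of the notions named above, uniform stability, in its standard formulation due to Bousquet and Elisseeff (2002): a learning algorithm $A$ has uniform stability $\beta$ with respect to a loss function $\ell$ if, for every training set $S$ of size $m$ and every index $i \in \{1,\dots,m\}$,
\[
\sup_{z}\,\bigl|\,\ell(A_S, z) - \ell(A_{S^{\setminus i}}, z)\,\bigr| \le \beta ,
\]
where $S^{\setminus i}$ denotes $S$ with the $i$-th example removed. Roughly speaking, the other notions compared in the survey relax this worst-case requirement so that it need hold only pointwise at the removed example or only in expectation over the sample.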