Generalization performance is one of the main objects of study in machine learning theory. To analyze the generalization ability of learning machines trained by the empirical risk minimization (ERM) algorithm on dependent observations, this paper establishes bounds on the rate of relative uniform convergence of the empirical risks to their expected risks for β-mixing dependent sequences. These bounds not only extend previous results for independent and identically distributed (i.i.d.) samples to the β-mixing dependent case, but also improve some existing results for β-mixing sequences. As a consequence, we obtain bounds on the generalization performance of learning machines trained by the ERM algorithm with β-mixing dependent samples.