This paper systematically analyzes existing smoothing algorithms for N-gram models and implements the Absolute, Witten-Bell (W-B), and Katz smoothing algorithms. To address the problem that the traditional Katz algorithm cannot discount the probability of certain fixed Chinese collocations, a new discounting coefficient is constructed using part-of-speech information. Under the new coefficient, the discount decreases as word frequency increases and increases with the number of distinct following words, which satisfies the properties a smoothing algorithm expects of its discounting coefficient. Experimental results show that the improved Katz smoothing algorithm reduces the cross entropy of the N-gram model and, when applied to Chinese word segmentation, also improves the F-measure of the segmentation results.
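The abstract does not give the exact form of the part-of-speech-based coefficient, so the sketch below is only a minimal illustration of the Katz back-off structure with a placeholder discount factor whose behavior matches the stated requirements: less discounting for higher counts, more discounting for histories followed by many distinct words. The function names and the specific formula freq / (freq + n_followers) are assumptions for illustration, not the paper's method.

```python
from collections import defaultdict

def train_bigram_counts(sentences):
    """Collect unigram counts, bigram counts, and the set of distinct followers."""
    unigrams = defaultdict(int)
    bigrams = defaultdict(int)
    followers = defaultdict(set)  # distinct words observed after each history word
    for sent in sentences:
        for w1, w2 in zip(sent, sent[1:]):
            unigrams[w1] += 1
            bigrams[(w1, w2)] += 1
            followers[w1].add(w2)
        if sent:
            unigrams[sent[-1]] += 1
    return unigrams, bigrams, followers

def discount_factor(freq, n_followers):
    """Hypothetical kept fraction: grows with frequency and shrinks as the number
    of distinct following words grows, so the discounted mass (1 - factor) is
    smaller for frequent n-grams and larger for histories with many followers.
    The paper's actual coefficient also uses part-of-speech information, which
    is not reproduced here."""
    return freq / (freq + n_followers)

def katz_bigram_prob(w1, w2, unigrams, bigrams, followers, total):
    """Back-off bigram probability in the spirit of Katz smoothing."""
    c12 = bigrams.get((w1, w2), 0)
    c1 = unigrams.get(w1, 0)
    if c12 > 0 and c1 > 0:
        d = discount_factor(c12, len(followers[w1]))
        return d * c12 / c1
    # Probability mass reserved for unseen followers of w1, redistributed
    # over the back-off unigram distribution with normalizer alpha.
    reserved = 1.0 - sum(
        discount_factor(bigrams[(w1, w)], len(followers[w1])) * bigrams[(w1, w)] / c1
        for w in followers.get(w1, ())
    ) if c1 > 0 else 1.0
    seen_mass = sum(unigrams[w] for w in followers.get(w1, ())) / total
    alpha = reserved / max(1e-12, 1.0 - seen_mass)
    return alpha * unigrams.get(w2, 0) / total
```

A usage example would train on tokenized (and, in the paper's setting, part-of-speech-tagged) Chinese text and compare cross entropy on held-out data before and after replacing `discount_factor` with the improved coefficient.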