The analytic representation of the point-to-hyperplane distance plays a decisive role in measuring margins and pattern separability, and such distances all reduce to norm-minimization problems. Apart from the L2 norm, which is easy to solve, other norms are difficult to handle. Take the L1 norm as an example: although the L1-norm problem is convex, the non-differentiability of the L1 norm means that no analytic expression has been available so far, so current L1 learning machines are not actually derived from the L1 margin. This paper studies the analytic computation of the L1 distance and of the projection onto a hyperplane in an L1-normed linear space. The main contributions are: (1) closed-form expressions, under the L1 norm, for the point-to-hyperplane distance and for the projection of a point onto the hyperplane; (2) a proof of the relationship between this projection and the L2 projection under the Euclidean metric, together with a geometric interpretation. Finally, simulation experiments verify the correctness and computational efficiency of the analytic solutions.
Analytic expressions for the distance between a point and the decision hyperplane play a decisive role in measuring margins and pattern separability. Theoretically, such distances can be cast as norm-minimization problems. However, compared with the L2 norm, non-L2-norm problems are harder to solve. As far as the L1 norm is concerned, state-of-the-art L1 machines are not genuinely derived from the L1 margin, because no closed-form L1 solution has been available: although the L1 minimization problem is convex, its non-differentiability forces the use of time-consuming iterative methods. In this paper, we discuss how to compute the L1 distance and projection analytically in an L1-normed linear space. Concretely, in analogy with the L2 norm, we introduce two analytic formulas, one for the L1 distance and one for the L1 projection. Furthermore, we prove a relationship between the L1 distance and the L2 Euclidean distance, which can be described by a bilateral inequality. Finally, in comparison with linear-programming methods, we demonstrate the effectiveness and efficiency of the proposal on several datasets.
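The abstract does not reproduce the formulas themselves, but the standard dual-norm fact gives the kind of closed form described: the L1 distance from a point x0 to the hyperplane {x : w·x + b = 0} is |w·x0 + b| / ||w||_∞, attained by moving x0 along the single coordinate where |w_j| is largest. A minimal sketch, assuming the paper's formulas coincide with this dual-norm result (the function name and the test point are illustrative, not from the paper):

```python
import numpy as np

def l1_distance_projection(w, b, x0):
    """Closed-form L1 distance from x0 to H = {x : w.x + b = 0},
    together with an L1-nearest point on H (dual-norm sketch)."""
    r = w @ x0 + b                    # signed residual of x0 w.r.t. H
    j = int(np.argmax(np.abs(w)))     # coordinate carrying ||w||_inf
    d1 = abs(r) / np.abs(w[j])        # L1 distance = |r| / ||w||_inf
    p = x0.astype(float).copy()
    p[j] -= r / w[j]                  # a single-coordinate move lands on H
    return d1, p

w = np.array([3.0, -1.0, 2.0])
b = 0.5
x0 = np.array([1.0, 2.0, -1.0])

d1, p = l1_distance_projection(w, b, x0)
d2 = abs(w @ x0 + b) / np.linalg.norm(w)      # ordinary L2 distance

print(np.isclose(w @ p + b, 0.0))             # True: p lies on H
print(np.isclose(np.abs(p - x0).sum(), d1))   # True: its L1 length is d1
print(d2 <= d1 <= np.sqrt(len(w)) * d2)       # True: bilateral inequality
```

The last check illustrates the bilateral inequality mentioned in the abstract: since ||w||_∞ ≤ ||w||_2 ≤ √n·||w||_∞, the two distances satisfy d2 ≤ d1 ≤ √n·d2.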