This paper is the first to apply the theory of hypergroups from algebra to the study of granular computing. First, we introduce the definitions of normal hypergroup and strong normal hypergroup, and show that every normal hypergroup can be generated by a strong normal hypergroup. Then, taking T in the quotient space model (X, f, T) of granular computing to be a hypergroup structure, we use hypergroup homomorphisms to prove that x and y lie on the same path in the model (X, f, T) if and only if [x] and [y] lie on the same path in the quotient space model ([X], [f], [T]); furthermore, we prove that if X and Y are hypergroup-homomorphic, then the quotient spaces they induce are also hypergroup-homomorphic. Second, we study the relationships between normal hypergroups and ample fields in possibility theory, Pawlak approximation spaces, and topological spaces. We show that: (1) strong normal hypergroups and ample fields are equivalent; (2) strong normal hypergroups and Pawlak approximation spaces are equivalent; (3) the upper and lower approximations of a set can be defined by means of a hypergroup, and hypergroup homomorphisms can be characterized by these upper and lower approximations; (4) a strong normal hypergroup can be generated by a topological space, and a normal hypergroup can be generated by the strong normal hypergroup generated by a topological space; (5) ample fields in possibility theory and Pawlak approximation spaces are equivalent, and the ample field is exactly the collection of all definable sets in the approximation space. Our results show that both ample fields in possibility theory and Pawlak approximation spaces can be characterized by normal hypergroups; hence hypergroup theory can be applied to the study of granular computing.
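As background for the terms used above, the following minimal LaTeX sketch recalls the classical (Marty) definition of a hypergroup and the standard Pawlak lower and upper approximations. These are standard textbook definitions given only for orientation; they are assumptions of this sketch, not the paper's own definitions of normal and strong normal hypergroups, which are introduced in the body of the paper.

% Background only: classical hypergroup (Marty) and standard Pawlak approximations.
% The paper's "normal" and "strong normal" hypergroups are refinements defined in the paper itself.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

A \emph{hypergroup} is a nonempty set $H$ with a hyperoperation
$\circ : H \times H \to \mathcal{P}^{*}(H)$ (the nonempty subsets of $H$) such that, for all $a,b,c \in H$,
\begin{align*}
  (a \circ b) \circ c &= a \circ (b \circ c) && \text{(associativity)},\\
  a \circ H = H \circ a &= H && \text{(reproduction axiom)},
\end{align*}
where $A \circ B = \bigcup_{a \in A,\, b \in B} a \circ b$ for subsets $A, B \subseteq H$.

In a Pawlak approximation space $(U, R)$, with $R$ an equivalence relation on $U$ and
$[x]_R$ the equivalence class of $x$, the lower and upper approximations of $A \subseteq U$ are
\begin{align*}
  \underline{R}(A) &= \{\, x \in U : [x]_R \subseteq A \,\},\\
  \overline{R}(A)  &= \{\, x \in U : [x]_R \cap A \neq \emptyset \,\},
\end{align*}
and $A$ is called \emph{definable} when $\underline{R}(A) = \overline{R}(A)$.

\end{document}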