By constructing a radially unbounded, positive definite Lyapunov function, the global dissipativity of a class of neural networks with finite distributed delays is studied. Under the assumption that the delays are differentiable, several sufficient conditions ensuring global dissipativity of the networks are given. Finally, a numerical example illustrates the effectiveness of these conditions.
In this paper, the global dissipativity of a class of neural networks with finite distributed delays is investigated by constructing a radially unbounded, positive definite Lyapunov function. Under the assumption that the delays are differentiable, several sufficient conditions are given to guarantee global exponential dissipativity of the neural networks. Finally, a numerical example is given to illustrate the applicability of the results.
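For orientation, a model of this class and the dissipativity notion used above typically take the following form; the notation ($a_i$, $b_{ij}$, $c_{ij}$, $f_j$, $r$, $u_i$) is illustrative and not taken from the paper itself:
\[
\dot{x}_i(t) = -a_i x_i(t) + \sum_{j=1}^{n} b_{ij} f_j\bigl(x_j(t)\bigr)
             + \sum_{j=1}^{n} c_{ij} \int_{t-r}^{t} f_j\bigl(x_j(s)\bigr)\,ds + u_i ,
\qquad i = 1,\dots,n,
\]
where the integral term models the finite distributed delay over the window $[t-r,\,t]$. The network is said to be globally dissipative if there exists a compact set $S \subset \mathbb{R}^n$ such that every trajectory enters $S$ in finite time and remains there; a radially unbounded, positive definite Lyapunov function $V$ (i.e. $V(x)>0$ for $x\neq 0$ and $V(x)\to\infty$ as $\|x\|\to\infty$) is the standard tool for exhibiting such an attracting set.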