Abstract:
Recent reports show that projection neural networks with a low-dimensional state space can significantly increase computation speed. This paper proposes two projection neural networks with reduced model dimension and complexity (RDPNNs) for solving nonlinear programming (NP) problems. Compared with existing projection neural networks for solving NP problems, the two proposed RDPNNs have a lower-dimensional state space and lower model complexity. Under the condition that the Hessian matrix of the associated Lagrangian function is positive semi-definite, and positive definite at each Karush-Kuhn-Tucker point, the two proposed RDPNNs are proven to be globally stable in the sense of Lyapunov and to converge globally to a point satisfying the reduced optimality conditions of the NP problem. Therefore, the two proposed RDPNNs are theoretically guaranteed to solve convex NP problems as well as a class of nonconvex NP problems. Computational results show that the two proposed RDPNNs are faster than existing projection neural networks for solving NP problems.
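The abstract builds on the general idea of projection neural networks: a dynamical system whose trajectory stays in (or converges to) the feasible set via a projection operator, and whose equilibria satisfy the optimality conditions. A minimal sketch of the classic projection dynamics dx/dt = -x + P_Omega(x - alpha * grad f(x)), integrated by forward Euler on a simple box-constrained problem, is shown below. This illustrates the standard projection-dynamics technique only; it is not the paper's reduced-dimension RDPNN model, and the problem, step sizes, and function names here are illustrative assumptions.

```python
import numpy as np

def project_box(x, lo, hi):
    """Projection P_Omega onto the box [lo, hi] (the feasible set)."""
    return np.clip(x, lo, hi)

def solve_pnn(c, lo, hi, alpha=0.5, dt=0.1, steps=2000):
    """Forward-Euler integration of the classic projection dynamics
    dx/dt = -x + P_Omega(x - alpha * grad_f(x)),
    here for f(x) = ||x - c||^2, whose gradient is 2(x - c).
    Equilibria satisfy x = P_Omega(x - alpha * grad_f(x)),
    i.e. the optimality condition of the constrained problem."""
    x = np.zeros_like(c)
    grad = lambda x: 2.0 * (x - c)
    for _ in range(steps):
        x = x + dt * (-x + project_box(x - alpha * grad(x), lo, hi))
    return x

c = np.array([2.0, -3.0, 0.5])
lo = np.array([-1.0, -1.0, -1.0])
hi = np.array([1.0, 1.0, 1.0])
x_star = solve_pnn(c, lo, hi)
# x_star approaches the projection of c onto the box: [1.0, -1.0, 0.5]
```

For this strongly convex objective the dynamics converge globally to the unique constrained minimizer; the paper's contribution is obtaining such guarantees with a lower-dimensional state vector than models of this form.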
Source:
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Year: 2020
Issue: 6
Volume: 31
Page: 2020-2029
Impact Factor: 10.451 (JCR@2020); 10.200 (JCR@2023)
ESI Discipline: COMPUTER SCIENCE;
ESI HC Threshold:149
JCR Journal Grade:1
CAS Journal Grade:1
Cited Count:
WoS CC Cited Count: 38
SCOPUS Cited Count: 34
ESI Highly Cited Papers on the List: 0