IEEE Transactions on Neural Networks, Volume 12, Issue 6, 2001, Pages 1521-1525

Qualitative analysis of a recurrent neural network for nonlinear continuously differentiable convex minimization over a nonempty closed convex subset

Author keywords

Closed convex subsets; Convex minimization; Global convergence; Global existence of solutions; Global exponential convergence; Recurrent neural networks (RNNs); Uniform convexity

Indexed keywords

ASYMPTOTIC STABILITY; COMPUTER SIMULATION; LYAPUNOV METHODS; OPTIMIZATION; QUADRATIC PROGRAMMING; SET THEORY; THEOREM PROVING;

EID: 0035506173     PISSN: 1045-9227     EISSN: None     Source Type: Journal
DOI: 10.1109/72.963790     Document Type: Letter
Times cited: 5
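For orientation only: the keywords above (recurrent neural networks, convex minimization over a closed convex subset, global exponential convergence) describe the projection-type gradient-flow networks studied in this literature. The sketch below is not taken from the indexed letter; the dynamics dx/dt = -x + P_Omega(x - alpha * grad f(x)), the quadratic objective, the box constraint Omega, and all function names are illustrative assumptions about this general class of models.

import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the closed convex box {x : lo <= x_i <= hi}
    return np.clip(x, lo, hi)

def grad_f(x, Q, c):
    # Gradient of the illustrative convex quadratic f(x) = 0.5 x'Qx + c'x
    return Q @ x + c

def run_network(x0, Q, c, lo=-1.0, hi=1.0, alpha=0.5, dt=0.01, steps=5000):
    # Forward-Euler simulation of the assumed projection-type dynamics
    # dx/dt = -x + P_box(x - alpha * grad_f(x)); an equilibrium of these
    # dynamics is a minimizer of f over the box for convex f.
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + project_box(x - alpha * grad_f(x, Q, c), lo, hi))
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 4
    A = rng.standard_normal((n, n))
    Q = A.T @ A + np.eye(n)   # positive definite, so the objective is strictly convex
    c = rng.standard_normal(n)
    x_eq = run_network(rng.standard_normal(n), Q, c)
    print("approximate constrained minimizer:", x_eq)

The box constraint is used here purely because its projection has a closed form; any nonempty closed convex set with a computable Euclidean projection would play the same role in such a sketch.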

References (11)
  • 5
    • 0035506769
    • Liang, X.B.
    • A recurrent neural network for nonlinear continuously differentiable optimization over a compact convex subset
    • (2001) IEEE Trans. Neural Networks, vol. 12, pp. 1521-1524, Nov.
  • 6
    • 0344704362
    • Harker, P.T., Pang, J.S.
    • Finite-dimensional variational inequality and nonlinear complementarity problems: A survey of theory, algorithms, and applications
    • (1990) Math. Program., vol. 48, pp. 161-220
  • 7
    • 0001170028
    • Pang, J.S.
    • A posteriori error bounds for the linearly constrained variational inequality problem
    • (1987) Math. Operations Res., vol. 12, issue 3, pp. 474-484


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.