Volume 8, Issue C, 2001, Pages 381-407

Distributed asynchronous incremental subgradient methods

Author keywords: none available

Indexed keywords: none available

EID: 77956652909     PISSN: 1570-579X     EISSN: none     Source Type: Book Series
DOI: 10.1016/S1570-579X(01)80023-9     Document Type: Chapter
Times cited : (121)
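For context on the chapter's subject: an incremental subgradient method minimizes a sum f(x) = f_1(x) + ... + f_m(x) of convex, possibly nondifferentiable components by stepping along a subgradient of just one component at a time, cycling through them with a diminishing stepsize. A minimal serial, unconstrained Python sketch (the function and variable names are illustrative, not taken from the chapter, which studies distributed asynchronous variants):

```python
def incremental_subgradient(components, x0, steps=2000, alpha0=1.0):
    """Minimize f(x) = sum_i f_i(x) by cycling through the components,
    taking a subgradient step on one f_i at a time."""
    x = x0
    n = len(components)
    for k in range(steps):
        alpha = alpha0 / (k + 1)            # diminishing stepsize (sum diverges)
        f_i, subgrad_i = components[k % n]  # cyclic order through the components
        x = x - alpha * subgrad_i(x)
    return x

# Example: f(x) = sum_i |x - a_i|; a subgradient of |x - a| is sign(x - a).
anchors = [1.0, 2.0, 10.0]
components = [(lambda x, a=a: abs(x - a),
               lambda x, a=a: (x > a) - (x < a)) for a in anchors]

x_star = incremental_subgradient(components, x0=0.0)
# x_star settles near 2.0, the median of the anchors (the minimizer of f)
```

The per-cycle net displacement pushes x toward the minimizer while the shrinking stepsize damps the oscillation between component steps; the chapter's distributed asynchronous versions relax the strict cyclic, single-processor structure assumed here.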

References (26)
  • 1
    Ben-Tal, A., Margalit, T., and Nemirovski, A., The Ordered Subsets Mirror Descent Optimization Method and its Use for the Positron Emission Tomography Reconstruction, SIAM J. Optim., to appear.
  • 3
    Bertsekas, D.P., A New Class of Incremental Gradient Methods for Least Squares Problems, SIAM J. on Optimization 7 (1997) 913-926.
  • 7
    Borkar, V.S., Asynchronous Stochastic Approximation, SIAM J. on Control and Optimization 36 (1998) 840-851.
  • 8
    Gaivoronski, A.A., Convergence Analysis of Parallel Backpropagation Algorithm for Neural Networks, Opt. Meth. and Software 4 (1994) 117-134.
  • 10
    Grippo, L., A Class of Unconstrained Minimization Methods for Neural Network Training, Opt. Meth. and Software 4 (1994) 135-150.
  • 11
    Kaskavelis, C.A., and Caramanis, M.C., Efficient Lagrangian Relaxation Algorithms for Industry Size Job-Shop Scheduling Problems, IIE Trans. on Scheduling and Logistics 30 (1998) 1085-1097.
  • 12
    Kibardin, V.M., Decomposition into Functions in the Minimization Problem, Automation and Remote Control 40 (1980) 1311-1323.
  • 13
    Kiwiel, K.C., and Lindberg, P.O., Parallel Subgradient Methods for Convex Optimization, submitted to the Proceedings of the March 2000 Haifa Workshop "Inherently Parallel Algorithms in Feasibility and Optimization and Their Applications", D. Butnariu, Y. Censor, and S. Reich, Eds., Studies in Computational Mathematics, Elsevier, Amsterdam.
  • 14
    Kiwiel, K.C., Convergence of Approximate and Incremental Subgradient Methods for Convex Optimization, submitted to SIAM J. on Optimization.
  • 16
    Luo, Z.Q., On the Convergence of the LMS Algorithm with Adaptive Learning Rate for Linear Feedforward Networks, Neural Computation 3 (1991) 226-245.
  • 17
    Luo, Z.Q., and Tseng, P., Analysis of an Approximate Gradient Projection Method with Applications to the Backpropagation Algorithm, Opt. Meth. and Software 4 (1994) 85-101.
  • 18
    Mangasarian, O.L., and Solodov, M.V., Serial and Parallel Backpropagation Convergence Via Nonmonotone Perturbed Minimization, Opt. Meth. and Software 4 (1994) 103-116.
  • 19
    Nedić, A., and Bertsekas, D.P., Incremental Subgradient Methods for Nondifferentiable Optimization, Lab. for Info. and Decision Systems Report LIDS-P-2460, Massachusetts Institute of Technology, Cambridge, MA, 1999.
  • 20
    Nedić, A., and Bertsekas, D.P., Incremental Subgradient Methods for Nondifferentiable Optimization, submitted to SIAM J. Optimization.
  • 21
    Nedić, A., and Bertsekas, D.P., Convergence Rate of Incremental Subgradient Algorithms, Lab. for Info. and Decision Systems Report LIDS-P-2475, Massachusetts Institute of Technology, Cambridge, MA, 2000; to appear in Stochastic Optimization: Algorithms and Applications, S. Uryasev and P.M. Pardalos, Eds.
  • 22
    Solodov, M.V., and Zavriev, S.K., Error Stability Properties of Generalized Gradient-Type Algorithms, J. of Opt. Theory and Applications 98 (1998) 663-680.
  • 23
    Tseng, P., An Incremental Gradient(-Projection) Method with Momentum Term and Adaptive Stepsize Rule, SIAM J. on Optimization 8 (1998) 506-531.
  • 24
    Tsitsiklis, J.N., Bertsekas, D.P., and Athans, M., Distributed Asynchronous Deterministic and Stochastic Gradient Optimization Algorithms, IEEE Trans. on Automatic Control AC-31 (1986) 803-812.
  • 26
    Zhao, X., Luh, P.B., and Wang, J., Surrogate Gradient Algorithm for Lagrangian Relaxation, J. of Opt. Theory and Applications 100 (1999) 699-712.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.