Volume 33, Issue 2, 2010, Pages 279-287

Research on extreme learning of neural networks

Author keywords

Extreme learning machine; Least square; Neural network; Regularized extreme learning machine; Structural risk; Support vector machine

Indexed keywords

AUTOMATIC CONTROL; EXTREME LEARNING MACHINE; GRADIENT DESCENT; HIDDEN LAYERS; LEAST SQUARE; LOCAL MINIMUMS; MULTIPLE ITERATIONS; NETWORK PARAMETERS; NOVEL ALGORITHM; NUMBER OF ITERATIONS; REGULARIZED EXTREME LEARNING MACHINE; SEARCHING SPACES; STRUCTURAL RISK; STRUCTURAL RISK MINIMIZATION; TRADITIONAL LEARNING; WEIGHTED LEAST SQUARES;

EID: 77950553517     PISSN: 02544164     EISSN: None     Source Type: Journal    
DOI: 10.3724/SP.J.1016.2010.00279     Document Type: Article
Times cited: 142
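The indexed keywords summarize the technique the paper builds on: an extreme learning machine fixes the hidden-layer weights at random and obtains only the output weights through a least-squares solve, and the "regularized" variant adds a penalty term reflecting structural risk. Below is a minimal sketch of that idea, assuming a tanh hidden layer and a ridge-style penalty; elm_fit, n_hidden, and reg are hypothetical names chosen for this illustration, not the paper's specific algorithm.

    import numpy as np

    def elm_fit(X, y, n_hidden=50, reg=1e-3, seed=0):
        # Fix random input weights and biases; only the output weights are learned.
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))
        b = rng.normal(size=n_hidden)
        H = np.tanh(X @ W + b)                      # hidden-layer output matrix
        # Ridge-regularized least-squares solve: beta = (H'H + reg*I)^{-1} H'y
        beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    # Usage on synthetic 1-D regression data
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    W, b, beta = elm_fit(X, y)
    print("train MSE:", np.mean((elm_predict(X, W, b, beta) - y) ** 2))

Because no gradient descent or iterative search over the hidden-layer parameters is involved, training reduces to a single linear solve, which is the speed advantage the keywords (gradient descent, local minimums, number of iterations) allude to.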

References (18)
  • 1. Hornik K. Approximation capabilities of multilayer feedforward networks. Neural Networks, 1991, 4(2): 251-257.
  • 2. Leshno M, Lin V Y, Pinkus A, Schocken S. Multilayer feedforward networks with a nonpolynomial activation function can approximate any function. Neural Networks, 1993, 6(6): 861-867.
  • 3. Huang G-B, Babri H A. Upper bounds on the number of hidden neurons in feedforward networks with arbitrary bounded nonlinear activation functions. IEEE Transactions on Neural Networks, 1998, 9(1): 224-229.
  • 4. Huang G-B. Learning capability and storage capacity of two hidden-layer feedforward networks. IEEE Transactions on Neural Networks, 2003, 14(2): 274-281.
  • 5. Huang G-B, Zhu Q-Y, Siew C-K. Extreme learning machine: Theory and applications. Neurocomputing, 2006, 70(1-3): 489-501.
  • 10. Tamura S, Tateishi M. Capabilities of a four-layered feedforward neural network: Four layers versus three. IEEE Transactions on Neural Networks, 1997, 8(2): 251-255.
  • 13. Zhang Xue-Gong. Introduction to statistical learning theory and support vector machines. Acta Automatica Sinica, 2000, 26(1): 32-42 (in Chinese).
  • 14. David H A. Early sample measures of variability. Statistical Science, 1998, 13(4): 368-377.
  • 15. Suykens J A K, De Brabanter J, Lukas L, Vandewalle J. Weighted least squares support vector machines: Robustness and sparse approximation. Neurocomputing, 2002, 48(1): 85-105.
  • 16. Hsu C-W, Lin C-J. A comparison of methods for multiclass support vector machines. IEEE Transactions on Neural Networks, 2002, 13(2): 415-425.
  • 17. Collobert R, Bengio S, Bengio Y. A parallel mixture of SVMs for very large scale problems. Neural Computation, 2002, 14(5): 1105-1114.
  • 18. Keerthi S S, Shevade S K. SMO algorithm for least squares SVM formulations. Neural Computation, 2003, 15(2): 487-507.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.