
IEEE Transactions on Neural Networks, Volume 22, Issue 10, 2011, Pages 1588-1598

A new formulation for feedforward neural networks

Author keywords

Feedforward neural networks; generalization; geometrical interpretation; internal behavior; measure of regularization; reformulated neural network; training

Indexed keywords

Black-box model; Derivative-free optimization; Error response; Function approximation techniques; Generalization; Generalization ability; Geometrical interpretation; Internal behavior; Learning abilities; Measure of regularization; Multiple test; Network weights; Training methods

EID: 80053625918     ISSN: 1045-9227     eISSN: None     Source Type: Journal
DOI: 10.1109/TNN.2011.2163169     Document Type: Review
Times cited: 111

References (35)
  • 1. K. Hornik, M. Stinchcombe, and H. White, "Multilayer feedforward networks are universal approximators," Neural Netw., vol. 2, no. 5, pp. 359-366, 1989. DOI: 10.1016/0893-6080(89)90020-8
  • 2. T. P. Chen and H. Chen, "Universal approximation to nonlinear operators by neural networks with arbitrary activation functions and its application to dynamical systems," IEEE Trans. Neural Netw., vol. 6, no. 4, pp. 911-917, Jul. 1995.
  • 3. M. Leshno, V. Y. Lin, A. Pinkus, and S. Schocken, "Multilayer feedforward networks with a nonpolynomial activation function can approximate any function," Neural Netw., vol. 6, no. 6, pp. 861-867, 1993.
  • 4. T. P. Chen and H. Chen, "Approximations of continuous functionals by neural networks with application to dynamic systems," IEEE Trans. Neural Netw., vol. 4, no. 6, pp. 910-918, Nov. 1993.
  • 5. T. P. Chen, H. Chen, and R. W. Liu, "Approximation capability in C(R̄^n) by multilayer feedforward networks and related problems," IEEE Trans. Neural Netw., vol. 6, no. 1, pp. 25-30, Jan. 1995.
  • 6. S. Tamura and M. Tateishi, "Capabilities of a four-layered feedforward neural network: Four layers versus three," IEEE Trans. Neural Netw., vol. 8, no. 2, pp. 251-255, Mar. 1997.
  • 7. J. de Villiers and E. Barnard, "Backpropagation neural nets with one and two hidden layers," IEEE Trans. Neural Netw., vol. 4, no. 1, pp. 136-141, Jan. 1993.
  • 8. S. Razavi, B. A. Tolson, and D. H. Burn, "Numerical assessment of metamodelling strategies in computationally intensive optimization," Environ. Modell. Softw., 2011, to be published.
  • 9. N.-Y. Liang, G.-B. Huang, P. Saratchandran, and N. Sundararajan, "A fast and accurate online sequential learning algorithm for feedforward networks," IEEE Trans. Neural Netw., vol. 17, no. 6, pp. 1411-1423, Nov. 2006. DOI: 10.1109/TNN.2006.880583
  • 10. T. Kathirvalavakumar and S. J. Subavathi, "Neighborhood based modified backpropagation algorithm using adaptive learning parameters for training feedforward neural networks," Neurocomputing, vol. 72, nos. 16-18, pp. 3915-3921, Oct. 2009.
  • 11. H. T. Huynh and Y. Won, "Evolutionary algorithm for training compact single hidden layer feedforward neural networks," in Proc. IEEE Int. Joint Conf. Neural Netw., vols. 1-8, Hong Kong, Jun. 2008, pp. 3028-3033.
  • 12. E. J. Teoh, K. C. Tan, and C. Xiang, "Estimating the number of hidden neurons in a feedforward network using the singular value decomposition," IEEE Trans. Neural Netw., vol. 17, no. 6, pp. 1623-1629, Nov. 2006. DOI: 10.1109/TNN.2006.880582
  • 13. J. M. Benitez, J. L. Castro, and I. Requena, "Are artificial neural networks black boxes?" IEEE Trans. Neural Netw., vol. 8, no. 5, pp. 1156-1164, Sep. 1997.
  • 14. J. L. Castro, C. J. Mantas, and J. M. Benitez, "Interpretation of artificial neural networks by means of fuzzy rules," IEEE Trans. Neural Netw., vol. 13, no. 1, pp. 101-116, Jan. 2002. DOI: 10.1109/72.977279
  • 15. A. B. Tickle, R. Andrews, M. Golea, and J. Diederich, "The truth will come to light: Directions and challenges in extracting the knowledge embedded within trained artificial neural networks," IEEE Trans. Neural Netw., vol. 9, no. 6, pp. 1057-1068, Nov. 1998.
  • 16. D. Nguyen and B. Widrow, "Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights," in Proc. Int. Joint Conf. Neural Netw., vol. 3, San Diego, CA, Jun. 1990, pp. 21-26.
  • 17. J. Y. F. Yam and T. W. S. Chow, "Feedforward networks training speed enhancement by optimal initialization of the synaptic coefficients," IEEE Trans. Neural Netw., vol. 12, no. 2, pp. 430-434, Mar. 2001. DOI: 10.1109/72.914538
  • 18. C. Xiang, S. Q. Ding, and T. H. Lee, "Geometrical interpretation and architecture selection of MLP," IEEE Trans. Neural Netw., vol. 16, no. 1, pp. 84-96, Jan. 2005. DOI: 10.1109/TNN.2004.836197
  • 19. L. Hamm, B. W. Brorsen, and M. T. Hagan, "Comparison of stochastic global optimization methods to estimate neural network weights," Neural Process. Lett., vol. 26, no. 3, pp. 145-158, 2007. DOI: 10.1007/s11063-007-9048-7
  • 20. D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning representations by back-propagating errors," Nature, vol. 323, no. 6088, pp. 533-536, Oct. 1986.
  • 22. M. T. Hagan and M. B. Menhaj, "Training feedforward networks with the Marquardt algorithm," IEEE Trans. Neural Netw., vol. 5, no. 6, pp. 989-993, Nov. 1994.
  • 24. G. Lera and M. Pinzolas, "Neighborhood based Levenberg-Marquardt algorithm for neural network training," IEEE Trans. Neural Netw., vol. 13, no. 5, pp. 1200-1203, Sep. 2002.
  • 25. D. Panagiotopoulos, C. Orovas, and D. Syndoukas, "A heuristically enhanced gradient approximation (HEGA) algorithm for training neural networks," Neurocomputing, vol. 73, nos. 7-9, pp. 1303-1323, Mar. 2010.
  • 26. P. K. H. Phua and D. H. Ming, "Parallel nonlinear optimization techniques for training neural networks," IEEE Trans. Neural Netw., vol. 14, no. 6, pp. 1460-1468, Nov. 2003.
  • 27. R. S. Sexton, R. E. Dorsey, and J. D. Johnson, "Beyond backpropagation: Using simulated annealing for global optimization for neural networks," in Proc. Annu. Meet. Decis. Sci. Inst., vols. 1-3, 1997, pp. 346-348.
  • 28. M. Meissner, M. Schmuker, and G. Schneider, "Optimized particle swarm optimization (OPSO) and its application to artificial neural network training," BMC Bioinf., vol. 7, no. 1, p. 125, 2006.
  • 30. F. H. F. Leung, H. K. Lam, S. H. Ling, and P. K. S. Tam, "Tuning of the structure and parameters of a neural network using an improved genetic algorithm," IEEE Trans. Neural Netw., vol. 14, no. 1, pp. 79-88, Jan. 2003.
  • 31. R. S. Sexton, R. E. Dorsey, and J. D. Johnson, "Toward global optimization of neural networks: A comparison of the genetic algorithm and backpropagation," Decis. Support Syst., vol. 22, no. 2, pp. 171-185, Feb. 1998.
  • 32. J.-T. Tsai, J.-H. Chou, and T.-K. Liu, "Tuning the structure and parameters of a neural network by using hybrid Taguchi-genetic algorithm," IEEE Trans. Neural Netw., vol. 17, no. 1, pp. 69-80, Jan. 2006. DOI: 10.1109/TNN.2005.860885
  • 33. B. A. Tolson and C. A. Shoemaker, "Dynamically dimensioned search algorithm for computationally efficient watershed model calibration," Water Resour. Res., vol. 43, no. 1, pp. W01413-1-W01413-16, Jan. 2007.
  • 34. D. J. C. MacKay, "Bayesian interpolation," Neural Comput., vol. 4, no. 3, pp. 415-447, May 1992.
  • 35. F. D. Foresee and M. T. Hagan, "Gauss-Newton approximation to Bayesian learning," in Proc. Int. Joint Conf. Neural Netw., vol. 3, Houston, TX, Jun. 1997, pp. 1930-1935.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.