Volume 8, 2007, Pages 2017-2045

Very fast online learning of highly non linear problems

Chariatis, Aggelos (no affiliation, Greece)

Author keywords

Activation functions; Neural networks; Online training; Receptive fields; Selective attention

Indexed keywords

Approximation theory; Backpropagation; Feedforward neural networks; Functions; Nonlinear analysis; Online systems

EID: 34848877743     PISSN: 1532-4435     EISSN: 1533-7928     Source Type: Journal
DOI: None     Document Type: Article
Times cited: 9

References (48)
  • 1. C. C. Aggarwal, A. Hinneburg, and D. A. Keim. On the surprising behavior of distance metrics in high dimensional spaces. In J. Van den Bussche and V. Vianu, editors, Proceedings of the 8th International Conference on Database Theory (ICDT), volume 1973 of Lecture Notes in Computer Science, pages 420-434. Springer, 2001.
  • 2. K. Agyepong and R. Kothari. Controlling hidden layer capacity through lateral connections. Neural Computation, 9(6):1381-1402, 1997.
  • 4. L. B. Almeida, T. Langlois, and J. D. Amaral. On-line step size adaptation. Technical Report INESC RT07/97, INESC/IST, Rua Alves Redol, 1000 Lisbon, Portugal, 1997.
  • 5. S. Amari. Natural gradient works efficiently in learning. Neural Computation, 10(2):251-276, 1998.
  • 7. P. Bakker. Exception learning by backpropagation: A new error function. In P. Leong and M. Jabri, editors, Proceedings of the 4th Australian Conference on Neural Networks, pages 118-121, 1993.
  • 8. S. Baluja and D. Pomerleau. Using the representation in a neural network's hidden layer for task-specific focus of attention. In IJCAI, pages 133-141, 1995.
  • 9. L. Breiman. Bias, variance, and arcing classifiers. Technical Report 460, Statistics Department, University of California, 1996.
  • 10. D. S. Broomhead and D. Lowe. Multivariate functional interpolation and adaptive networks. Complex Systems, 2(3):321-355, 1988.
  • 12. D. L. Elliott. A better activation function for artificial neural networks. Technical Report TR 93-8, The Institute for Systems Research, University of Maryland, College Park, MD, 1993.
  • 13. G. W. Flake. Square unit augmented, radially extended, multilayer perceptrons. In G. B. Orr and K.-R. Müller, editors, Neural Networks: Tricks of the Trade, volume 1524 of Lecture Notes in Computer Science, pages 145-163. Springer, 1998.
  • 14. T. C. Fogarty. Technical note: First nearest neighbor classification on Frey and Slate's letter recognition problem. Machine Learning, 9(4):387-388, 1992.
  • 15. Y. Freund and R. E. Schapire. Experiments with a new boosting algorithm. In ICML, pages 148-156, 1996.
  • 16. P. W. Frey and D. J. Slate. Letter recognition using Holland-style adaptive classifiers. Machine Learning, 6:161-182, 1991.
  • 17. S. Geman and D. Geman. Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 6(6):721-741, 1984.
  • 18. T. Graepel and N. N. Schraudolph. Stable adaptive momentum for rapid online learning in nonlinear systems. In J. R. Dorronsoro, editor, Proceedings of the International Conference on Artificial Neural Networks (ICANN), volume 2415 of Lecture Notes in Computer Science, pages 450-455. Springer, 2002.
  • 19. M. Harmon and L. Baird. Multi-player residual advantage learning with general function approximation. Technical Report WL-TR-1065, Wright Laboratory, Wright-Patterson Air Force Base, OH 45433-6543, 1996.
  • 21. S. C. Huang and Y. F. Huang. Learning algorithms for perceptrons using back propagation with selective updates. IEEE Control Systems Magazine, pages 56-61, April 1990.
  • 22. R. A. Jacobs. Increased rates of convergence through learning rate adaptation. Neural Networks, 1:295-307, 1988.
  • 23. R. Kothari and D. Ensley. Decision boundary and generalization performance of feed-forward networks with Gaussian lateral connections. In S. K. Rogers, D. B. Fogel, J. C. Bezdek, and B. Bosacchi, editors, Applications and Science of Computational Intelligence, SPIE Proceedings, volume 3390, pages 314-321, 1998.
  • 24. B. Laheld and J. F. Cardoso. Adaptive source separation with uniform performance. In Proc. EUSIPCO, pages 183-186, September 1994.
  • 25. Y. LeCun, P. Simard, and B. Pearlmutter. Automatic learning rate maximization by on-line estimation of the Hessian's eigenvectors. In S. Hanson, J. Cowan, and L. Giles, editors, Advances in Neural Information Processing Systems, volume 5, pages 156-163. Morgan Kaufmann Publishers, San Mateo, CA, 1993.
  • 26. Y. LeCun, L. Bottou, G. B. Orr, and K.-R. Müller. Efficient backprop. In G. B. Orr and K.-R. Müller, editors, Neural Networks: Tricks of the Trade, volume 1524 of Lecture Notes in Computer Science, pages 9-50. Springer, 1998.
  • 31. G. B. Orr and T. K. Leen. Using curvature information for fast stochastic search. In M. Mozer, M. I. Jordan, and T. Petsche, editors, Advances in Neural Information Processing Systems 9 (NIPS), pages 606-612. MIT Press, 1996.
  • 33. M. Plumbley. A Hebbian/anti-Hebbian network which optimizes information capacity by orthonormalizing the principal subspace. In Proc. IEE Conf. on Artificial Neural Networks, Brighton, UK, pages 86-90, 1993.
  • 34. R. Reed, R. J. Marks, and S. Oh. Similarities of error regularization, sigmoid gain scaling, target smoothing, and training with jitter. IEEE Transactions on Neural Networks, 6(3):529-538, 1995.
  • 37. N. N. Schraudolph. Fast curvature matrix-vector products for second-order gradient descent. Neural Computation, 14(7):1723-1738, 2002.
  • 38. N. N. Schraudolph. Centering neural network gradient factors. In G. B. Orr and K.-R. Müller, editors, Neural Networks: Tricks of the Trade, volume 1524 of Lecture Notes in Computer Science, pages 207-226. Springer, 1998a.
  • 39. N. N. Schraudolph. Accelerated gradient descent by factor-centering decomposition. Technical Report IDSIA-33-98, Istituto Dalle Molle di Studi sull'Intelligenza Artificiale, 1998b.
  • 40. N. N. Schraudolph. Online local gain adaptation for multi-layer perceptrons. Technical Report IDSIA-09-98, Istituto Dalle Molle di Studi sull'Intelligenza Artificiale, Galleria 2, CH-6928 Manno, Switzerland, 1998c.
  • 41. N. N. Schraudolph. Local gain adaptation in stochastic gradient descent. In ICANN, pages 569-574. IEE, London, 1999.
  • 42. H. Schwenk and Y. Bengio. Boosting neural networks. Neural Computation, 12(8):1869-1887, 2000.
  • 43. H. Schwenk and Y. Bengio. Training methods for adaptive boosting of neural networks for character recognition. In M. Jordan, M. Kearns, and S. Solla, editors, Advances in Neural Information Processing Systems 10. MIT Press, Cambridge, MA, 1998.
  • 44. M. W. Spratling and M. H. Johnson. Neural coding strategies and mechanisms of competition. Cognitive Systems Research, 5(2):93-117, 2004.


* This record was extracted and analyzed by KISTI from Elsevier's SCOPUS database.