IEEE Transactions on Neural Networks, Volume 13, Issue 6, 2002, Pages 1268-1284

Deterministic nonmonotone strategies for effective training of multilayer perceptrons

Author keywords

Adaptive learning rate algorithms; Backpropagation (BP) algorithm; Multilayer perceptrons (MLPs); Nonmonotone minimization; Unconstrained minimization

Indexed keywords

ADAPTIVE ALGORITHMS; BACKPROPAGATION; ERROR ANALYSIS; HEURISTIC METHODS; LEARNING ALGORITHMS;

EID: 0036859211     ISSN: 1045-9227     Source Type: Journal
DOI: 10.1109/TNN.2002.804225     Document Type: Article
Times cited: 31

References (74)
  • 2. L. Armijo, "Minimization of functions having Lipschitz-continuous first partial derivatives," Pacific J. Math., vol. 16, pp. 1-3, 1966.
  • 3. A. Atiya and C. Ji, "How initial conditions affect generalization performance in large networks," IEEE Trans. Neural Networks, vol. 8, pp. 448-451, Mar. 1997.
  • 4. J. Barzilai and J.M. Borwein, "Two point step size gradient methods," IMA J. Numer. Anal., vol. 8, pp. 141-148, 1988.
  • 5. R. Battiti, "Accelerated backpropagation learning: Two optimization methods," Complex Syst., vol. 3, pp. 331-342, 1989.
  • 6. R. Battiti, "First- and second-order methods for learning: Between steepest descent and Newton's method," Neural Comput., vol. 4, pp. 141-166, 1992.
  • 7. S. Becker and Y. Le Cun, "Improving the convergence of the backpropagation learning with second order methods," in Proc. 1988 Connectionist Models Summer School, D.S. Touretzky, G.E. Hinton, and T.J. Sejnowski, Eds. San Mateo, CA: Morgan Kaufmann, 1988, pp. 29-37.
  • 8. C.M. Bishop, "Training with noise is equivalent to Tikhonov regularization," Neural Comput., vol. 7, pp. 108-116, 1995.
  • 10. A. Cauchy, "Méthode générale pour la résolution des systèmes d'équations simultanées," Comp. Rend. Acad. Sci. Paris, vol. 25, pp. 536-538, 1847.
  • 11. L.W. Chan and F. Fallside, "An adaptive training algorithm for back-propagation networks," Comput. Speech Language, vol. 2, pp. 205-218, 1987.
  • 12. J.E. Dennis and J.J. Moré, "Quasi-Newton methods, motivation and theory," SIAM Rev., vol. 19, pp. 46-89, 1977.
  • 15. S.E. Fahlman, "Faster-learning variations on back-propagation: An empirical study," in Proc. 1988 Connectionist Models Summer School, D.S. Touretzky, G.E. Hinton, and T.J. Sejnowski, Eds. San Mateo, CA: Morgan Kaufmann, 1988, pp. 38-51.
  • 17. O. Fujita, "Statistical estimation of the number of hidden units for feedforward neural networks," Neural Networks, vol. 11, pp. 851-859, 1998.
  • 18. J.C. Gilbert and J. Nocedal, "Global convergence properties of conjugate gradient methods for optimization," SIAM J. Optimization, vol. 2, pp. 21-42, 1992.
  • 20. A.A. Goldstein, "Cauchy's method of minimization," Numer. Math., vol. 4, pp. 146-150, 1962.
  • 21. L. Grippo, F. Lampariello, and S. Lucidi, "A nonmonotone line search technique for Newton's method," SIAM J. Numer. Anal., vol. 23, pp. 707-716, 1986.
  • 22. A. Gupta and S.M. Lam, "Weight decay backpropagation for noisy data," Neural Networks, vol. 11, pp. 1127-1137, 1998.
  • 25. R.M. Haralick, "Statistical and structural approaches to texture," Proc. IEEE, vol. 67, pp. 786-804, 1979.
  • 26. L. Holmstrom and P. Koistinen, "Using additive noise in backpropagation training," IEEE Trans. Neural Networks, vol. 3, pp. 24-38, Jan. 1992.
  • 27. H.C. Hsin, C.C. Li, M. Sun, and R.J. Sclabassi, "An adaptive training algorithm for back-propagation neural networks," IEEE Trans. Syst., Man, Cybern., vol. 25, pp. 512-514, Apr. 1995.
  • 28. R.A. Jacobs, "Increased rates of convergence through learning rate adaptation," Neural Networks, vol. 1, pp. 295-307, 1988.
  • 30. S.A. Karkanis, G.D. Magoulas, D.K. Iakovidis, D.A. Karras, and D.E. Maroulis, "Evaluation of textural feature extraction schemes for neural network-based interpretation of regions in medical images," in Proc. IEEE Int. Conf. Image Processing (ICIP-2001), vol. 1, Thessaloniki, Greece, 2001, pp. 281-284.
  • 31. S. Karkanis, G.D. Magoulas, and N. Theofanous, "Image recognition and neuronal networks: Intelligent systems for the improvement of imaging information," Minimally Invasive Therapy Allied Technol., vol. 9, pp. 225-230, 2000.
  • 32. G.N. Karystinos and D.A. Pados, "On overfitting, generalization, and randomly expanded training sets," IEEE Trans. Neural Networks, vol. 11, pp. 1050-1057, Sept. 2000.
  • 33. C.M. Kuan and K. Hornik, "Convergence of learning algorithms with constant learning rates," IEEE Trans. Neural Networks, vol. 2, pp. 484-488, Mar. 1991.
  • 35. S. Lawrence, C.L. Giles, and A.C. Tsoi, "What size neural network gives optimal generalization? Convergence properties of backpropagation," Univ. Maryland Tech. Rep. CS-TR-3617, 1996.
  • 36. Y. Le Cun, P.Y. Simard, and B.A. Pearlmutter, "Automatic learning rate maximization by on-line estimation of the Hessian's eigenvectors," in Advances in Neural Information Processing Systems 5, S.J. Hanson, J.D. Cowan, and C.L. Giles, Eds. San Mateo, CA: Morgan Kaufmann, 1993, pp. 156-163.
  • 37. Y. Lee, S.H. Oh, and M.W. Kim, "An analysis of premature saturation in backpropagation learning," Neural Networks, vol. 6, pp. 719-728, 1993.
  • 38. R. Liu, G. Dong, and X. Ling, "A convergence analysis for neural networks with constant learning rates and nonstationary inputs," in Proc. 34th Conf. Decision Contr., New Orleans, 1995, pp. 1278-1283.
  • 40. G.D. Magoulas, S.A. Karkanis, D.A. Karras, and M.N. Vrahatis, "Comparison study of textural descriptors for training neural network classifiers," Int. J. Comput. Res., Oct. 2002, to be published.
  • 43. G.D. Magoulas, M.N. Vrahatis, and G.S. Androulakis, "Improving the convergence of the back-propagation algorithm using learning rate adaptation methods," Neural Computation, vol. 11, pp. 1769-1796, 1999.
  • 45. K. Matsuoka, "Noise injection into inputs in backpropagation learning," IEEE Trans. Syst., Man, Cybern., vol. 22, pp. 436-440, Mar. 1992.
  • 46. M.F. Møller, "A scaled conjugate gradient algorithm for fast supervised learning," Neural Networks, vol. 6, pp. 525-533, 1993.
  • 47. P.M. Murphy and D.W. Aha, UCI Repository of Machine Learning Databases, Univ. California, Dept. Inform. Comput. Sci., Irvine, CA, 1994. [Online]. Available: http://www.ics.uci.edu/mlearn/MLRepository.html
  • 48. E. Mizutani and S.E. Dreyfus, "On complexity analysis of supervised MLP-learning for algorithmic comparisons," in Proc. Int. Joint Conf. Neural Networks, Washington, DC, 2001, pp. 347-352.
  • 49. D. Nguyen and B. Widrow, "Improving the learning speed of 2-layer neural network by choosing initial values of the adaptive weights," in IEEE Proc. 1st Int. Joint Conf. Neural Networks, vol. 3, 1990, pp. 21-26.
  • 50. J. Nocedal, "Theory of algorithms for unconstrained optimization," Acta Numerica, vol. 1, pp. 199-242, 1992.
  • 51. P.P. Ohanian and R.C. Dubes, "Performance evaluation for four classes of textural features," Pattern Recognition, vol. 25, pp. 819-833, 1992.
  • 53. D.B. Parker, "Optimal algorithms for adaptive networks: Second order back-propagation, second order direct propagation, and second order Hebbian learning," in Proc. IEEE Int. Conf. Neural Networks, 1987, pp. 593-600.
  • 54. M. Pfister and R. Rojas, "Speeding-up backpropagation - A comparison of orthogonal techniques," in Proc. Joint Conf. Neural Networks, Nagoya, Japan, 1993, pp. 517-523.
  • 55. V.P. Plagianakos, D.G. Sotiropoulos, and M.N. Vrahatis, "Automatic adaptation of learning rate for backpropagation neural networks," in Recent Advances in Circuits and Systems, N.E. Mastorakis, Ed. Singapore: World Scientific, 1998, pp. 337-341.
  • 56. V.P. Plagianakos, M.N. Vrahatis, and G.D. Magoulas, "Nonmonotone methods for backpropagation training with adaptive learning rate," in Proc. IEEE Int. Joint Conf. Neural Networks, vol. 3, Washington, DC, 1999, pp. 1762-1767.
  • 58. L. Prechelt, "Automatic early stopping using cross validation: Quantifying the criteria," Neural Networks, vol. 11, pp. 761-767, 1998.
  • 59. M. Raydan, "The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem," SIAM J. Optimization, vol. 7, pp. 26-33, 1997.
  • 60. M. Riedmiller and H. Braun, "A direct adaptive method for faster backpropagation learning: The Rprop algorithm," in Proc. IEEE Int. Conf. Neural Networks, San Francisco, CA, 1993, pp. 586-591.
  • 61. A.K. Rigler, J.M. Irvine, and T.P. Vogl, "Rescaling of variables in backpropagation learning," Neural Networks, vol. 4, pp. 225-229, 1991.
  • 63. F. Silva and L. Almeida, "Acceleration techniques for the back-propagation algorithm," in Lecture Notes in Computer Science, vol. 412. Berlin, Germany: Springer-Verlag, 1990, pp. 110-119.
  • 64. A. Sperduti and A. Starita, "Speed up learning and network optimization with extended back-propagation," Neural Networks, vol. 6, pp. 365-383, 1993.
  • 65. J. Strang and T. Taxt, "Local frequency features for texture classification," Pattern Recognition, vol. 27, pp. 1397-1406, 1994.
  • 67. N.K. Treadgold and T.D. Gedeon, "Exploring constructive cascade networks," IEEE Trans. Neural Networks, vol. 10, pp. 1335-1350, Nov. 1999.
  • 68. P.P. Van der Smagt, "Minimization methods for training feedforward neural networks," Neural Networks, vol. 7, pp. 1-11, 1994.
  • 69. M.N. Vrahatis, G.S. Androulakis, J.N. Lambrinos, and G.D. Magoulas, "A class of gradient unconstrained minimization algorithms with adaptive stepsize," J. Comput. Appl. Math., vol. 114, pp. 367-386, 2000.
  • 71. T.P. Vogl, J.K. Mangis, J.K. Rigler, W.T. Zink, and D.L. Alkon, "Accelerating the convergence of the back-propagation method," Biol. Cybern., vol. 59, pp. 257-263, 1988.
  • 72. R.L. Watrous, "Learning algorithms for connectionist networks: Applied gradient of nonlinear optimization," in Proc. IEEE Int. Conf. Neural Networks, vol. 2, 1987, pp. 619-627.
  • 73. S.A. Karkanis, D.K. Iakovidis, D.E. Maroulis, G.D. Magoulas, and N.G. Theofanous, "Tumor recognition in endoscopic video images using artificial neural network architectures," in Proc. 26th Euromicro Conf., F. Vajda, Ed. Los Alamitos, CA: IEEE Press, 2000.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.