Volume 19, Issue 8, 2008, Pages 1415-1430

The Q-norm complexity measure and the minimum gradient method: A novel approach to the machine learning structural risk minimization problem

Author keywords

Complexity measure; Multiobjective training algorithms; Neural networks; Parallel layer perceptron (PLP); Regularization methods; Structural risk minimization (SRM)

Indexed keywords

ARTIFICIAL INTELLIGENCE; CHLORINE COMPOUNDS; EDUCATION; GRADIENT METHODS; LEARNING SYSTEMS; NEURAL NETWORKS; RISK ASSESSMENT; ROBOT LEARNING

EID: 49649091536     PISSN: 1045-9227     EISSN: None     Source Type: Journal
DOI: 10.1109/TNN.2008.2000442     Document Type: Article
Times cited: 25

References (68)
  • 2
    • C. Cortes and V. Vapnik, "Support-vector networks," Mach. Learn., vol. 20, pp. 273-297, 1995.
  • 5
    • J. Shawe-Taylor and P. L. Bartlett, "Structural risk minimization over data-dependent hierarchies," IEEE Trans. Inf. Theory, vol. 44, no. 5, pp. 1926-1940, Sep. 1998.
  • 6
    • G. E. Hinton, "Connectionist learning procedures," Artif. Intell., vol. 40, pp. 185-234, 1989.
  • 7
    • A. N. Tikhonov, "On solving ill-posed problems and the method of regularization," Doklady Akademii Nauk USSR, vol. 153, pp. 501-504, 1963.
  • 10
    • R. A. Teixeira, A. P. Braga, R. H. C. Takahashi, and R. R. Saldanha, "Improving generalization of MLPs with multi-objective optimization," Neurocomputing, vol. 35, no. 1-4, pp. 189-194, 2000.
  • 11
    • M. A. Costa, A. P. Braga, B. R. Menezes, R. A. Teixeira, and G. G. Parma, "Training neural networks with a multi-objective sliding mode control algorithm," Neurocomputing, vol. 51, pp. 467-473, 2003.
  • 13
    • D. A. G. Vieira, W. M. Caminhas, and J. A. Vasconcelos, "Controlling the parallel layer perceptron complexity using a multiobjective learning algorithm," Neural Comput. Appl., vol. 16, no. 4-5, May 2006, DOI: 10.1007/s00521-006-0052-z.
  • 14
    • F. Girosi, M. Jones, and T. Poggio, "Regularization theory and neural networks architectures," Neural Comput., vol. 7, pp. 219-269, 1995.
  • 17
    • P. L. Bartlett, "The sample complexity of pattern classification with neural networks: The size of the weights is more important than the size of the network," IEEE Trans. Inf. Theory, vol. 44, no. 2, pp. 525-536, Mar. 1998.
  • 18
    • V. N. Vapnik, "Principles of structural risk minimization for learning theory," in Advances in Neural Information Processing Systems. Cambridge, MA: MIT Press, 1992, vol. 4, pp. 831-838.
  • 19
    • D. E. Rumelhart, G. E. Hinton, and R. J. Williams, Learning Internal Representations by Error Propagation, ser. Parallel Distributed Processing: Explorations in the Microstructure of Cognition, D. E. Rumelhart and J. L. McClelland, Eds. Cambridge, MA: Bradford Books (MIT Press), 1986, vol. 1.
  • 20
    • G. Cybenko, "Approximation by superpositions of a sigmoidal function," Math. Control Signals Syst., vol. 2, pp. 303-314, 1989.
  • 21
    • K. Funahashi, "On the approximate realization of continuous mappings by neural networks," Neural Netw., vol. 2, pp. 183-192, 1989.
  • 22
    • S. E. Fahlman, "Fast-learning variations on back-propagation: An empirical study," in Proc. Connectionist Models Summer School, D. Touretzky, G. Hinton, and T. Sejnowski, Eds., San Mateo, CA, 1988, pp. 38-51.
  • 23
    • M. T. Hagan and M. B. Menhaj, "Training feedforward networks with the Marquardt algorithm," IEEE Trans. Neural Netw., vol. 5, no. 6, pp. 989-993, Nov. 1994.
  • 24
    • P. L. Bartlett, "For valid generalization the size of the weights is more important than the size of the network," in Advances in Neural Information Processing Systems. Cambridge, MA: MIT Press, 1997, vol. 9, pp. 134-141.
  • 25
    • V. V. Vasin, "Relationship of several variational methods for approximate solutions of ill-posed problems," Math. Notes, vol. 7, pp. 161-166, 1970.
  • 26
    • D. S. Broomhead and D. Lowe, "Multivariable functional interpolation and adaptive networks," Complex Syst., vol. 2, pp. 321-355, 1988.
  • 27
    • J. Park and I. W. Sandberg, "Universal approximation using radial-basis-function networks," Neural Comput., vol. 3, pp. 246-257, 1991.
  • 28
    • A. R. Barron, "Universal approximation bounds for superpositions of a sigmoidal function," IEEE Trans. Inf. Theory, vol. 39, no. 3, pp. 930-945, May 1993.
  • 29
    • P. Andras, "The equivalence of support vector machine and regularization neural networks," Neural Process. Lett., vol. 15, no. 2, pp. 97-104, 2002.
  • 30
    • D. S. Yeung, W. W. Y. Ng, D. Wang, E. C. C. Tsang, and X.-Z. Wang, "Localized generalization error model and its application to architecture selection for radial basis function neural network," IEEE Trans. Neural Netw., vol. 18, no. 5, pp. 1294-1305, Sep. 2007.
  • 31
    • J. E. Moody and T. S. Rögnvaldsson, "Smoothing regularizers for projective basis function networks," in Advances in Neural Information Processing Systems, M. C. Mozer, M. I. Jordan, and T. Petsche, Eds. Cambridge, MA: MIT Press, 1997, vol. 9, pp. 585-605.
  • 32
    • H. Drucker and Y. LeCun, "Improving generalization performance using double back-propagation," IEEE Trans. Neural Netw., vol. 3, no. 6, pp. 991-997, Nov. 1992.
  • 33
    • C. M. Bishop, "Curvature driven smoothing: A learning algorithm for feedforward networks," IEEE Trans. Neural Netw., vol. 4, no. 5, pp. 882-884, Sep. 1993.
  • 34
    • D. A. G. Vieira, W. M. Caminhas, and J. A. Vasconcelos, "Extracting sensitivity information of electromagnetic devices models from a modified ANFIS topology," IEEE Trans. Magn., vol. 40, no. 2, pp. 1180-1183, Mar. 2004.
  • 35
    • Matlab Toolboxes, The MathWorks, Natick, MA [Online]. Available: www.mathworks.com
  • 36
    • K. Pelckmans, J. A. K. Suykens, and B. De Moor, "Convex approach to validation-based learning of the regularization constant," IEEE Trans. Neural Netw., vol. 18, no. 3, pp. 917-920, May 2007.
  • 37
    • D. Gabor, W. Wildes, and R. Woodcock, "A universal nonlinear filter, predictor and simulator which optimizes itself by a learning process," Proc. Inst. Electr. Eng., vol. 108B, pp. 422-438, 1961.
  • 38
    • Y. Xia and M. S. Kamel, "A generalized least absolute deviation method for parameter estimation of autoregressive signals," IEEE Trans. Neural Netw., vol. 19, no. 1, pp. 107-118, Jan. 2008.
  • 39
    • C. J. C. Burges and B. Schölkopf, "Improving the accuracy and speed of support vector machines," in Advances in Neural Information Processing Systems, M. C. Mozer, M. I. Jordan, and T. Petsche, Eds. Cambridge, MA: MIT Press, 1997, vol. 9, p. 375.
  • 40
    • IDA, "IDA Benchmark Repository Used in Several Boosting, KFD and SVM Papers," Tech. Rep. [Online]. Available: http://ida.first.gmd.de/~raetsch/data/benchmarks.htm
  • 41
    • K. Müller, S. Mika, G. Rätsch, K. Tsuda, and B. Schölkopf, "An introduction to kernel-based learning algorithms," IEEE Trans. Neural Netw., vol. 12, no. 2, pp. 181-201, Mar. 2001.
  • 42
    • J. Weston and R. Herbrich, "Adaptive margin support vector machines," in Advances in Large Margin Classifiers, A. Smola, P. Bartlett, B. Schölkopf, and D. Schuurmans, Eds. Cambridge, MA: MIT Press, 2000, pp. 281-295.
  • 43
    • G. C. Cawley and N. L. C. Talbot, "Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers," Pattern Recognit., vol. 36, pp. 2585-2592, Nov. 2003.
  • 44
    • V. Roth, "The generalized LASSO," IEEE Trans. Neural Netw., vol. 15, no. 1, pp. 16-28, Jan. 2004.
  • 45
    • G. C. Cawley and N. L. C. Talbot, "The evidence framework applied to sparse kernel logistic regression," Neurocomputing, vol. 65, pp. 119-135, Mar. 2005.
  • 46
    • Q. Tao, G.-W. Wu, F.-Y. Wang, and J. Wang, "Posterior probability support vector machines for unbalanced data," IEEE Trans. Neural Netw., vol. 16, no. 6, pp. 1561-1573, Nov. 2005.
  • 47
    • J. Demsar, "Statistical comparisons of classifiers over multiple data sets," J. Mach. Learn. Res., vol. 7, pp. 1-30, 2006.
  • 48
    • B. Zhang, "Is the maximal margin hyperplane special in a feature space?" Hewlett-Packard Labs, Palo Alto, CA, Tech. Rep., Apr. 2001.
  • 50
    • S. Amari and S. Wu, "Improving support vector machine classifier by modifying kernel functions," Neural Netw., vol. 12, pp. 783-789, 1999.
  • 51
    • K. Duan, S. S. Keerthi, and A. N. Poo, "Evaluation of simple performance measures for tuning SVM hyperparameters," Neurocomputing, vol. 51, pp. 41-59, 2003.
  • 52
    • Y. Tan and J. Wang, "A support vector machine with a hybrid kernel and minimal Vapnik-Chervonenkis dimension," IEEE Trans. Knowl. Data Eng., vol. 16, no. 4, pp. 385-395, Apr. 2004.
  • 54
    • G. Baudat and F. Anouar, "Kernel-based methods and function approximation," in Proc. Int. Joint Conf. Neural Netw., Washington, DC, 2001, pp. 1244-1249.
  • 55
    • E. Lacerda, A. Carvalho, A. P. Braga, and T. B. Ludermir, "Evolutionary radial basis functions for credit assessment," Appl. Intell., vol. 22, no. 3, pp. 167-182, 2005.
  • 56
    • S. P. Lloyd, "Least squares quantization in PCM," IEEE Trans. Inf. Theory, vol. IT-28, no. 2, pp. 129-137, Mar. 1982.
  • 59
    • M. A. Ismail and M. S. Kamel, "Multidimensional data clustering utilizing hybrid strategies," Pattern Recognit., vol. 22, pp. 75-89, 1989.
  • 60
    • J. MacQueen, "Some methods for classification and analysis of multivariate observations," in Proc. 5th Berkeley Symp. Math., 1967, vol. 1, pp. 281-297.
  • 61
    • C. Chinrungrueng and C. H. Séquin, "Optimal adaptive k-means algorithm with dynamic adjustment of learning rate," IEEE Trans. Neural Netw., vol. 6, no. 1, pp. 157-169, Jan. 1995.
  • 63
    • S. E. Fahlman and C. Lebiere, "The cascade-correlation learning architecture," in Advances in Neural Information Processing Systems, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990, vol. 2, pp. 524-532.
  • 66
    • J. S. R. Jang, "ANFIS: Adaptive-network-based fuzzy inference systems," IEEE Trans. Syst. Man Cybern., vol. 23, no. 3, pp. 665-685, May 1993.
  • 67
    • T. Yamakawa, E. Uchino, T. Miki, and Kusanagi, "A neo fuzzy neuron and its applications to system identification and prediction of the system behavior," in Proc. 2nd Int. Conf. Fuzzy Logic Neural Netw., Japan, 1992, pp. 477-483.
  • 68
    • S. Geman, E. Bienenstock, and R. Doursat, "Neural networks and the bias-variance dilemma," Neural Comput., vol. 4, no. 1, pp. 1-58, 1992.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.