Volume 11, 2010, Pages 61-87

Model selection: Beyond the Bayesian/frequentist divide

Author keywords

Bayesian priors; Bias-variance tradeoff; Ensemble methods; Guaranteed risk minimization; Model selection; Multilevel inference; Multilevel optimization; Over-fitting; Performance prediction; Regularization; Structural risk minimization

Indexed keywords

BAYESIAN; ENSEMBLE METHODS; MODEL SELECTION; MULTILEVEL OPTIMIZATION; OVERFITTING; PERFORMANCE PREDICTION; RISK MINIMIZATION; STRUCTURAL RISK MINIMIZATION;

EID: 76749118521     PISSN: 1532-4435     EISSN: 1533-7928     Source Type: Journal
DOI: None     Document Type: Article
Times cited: 152

References (84)
  • 1. M. Adankon and M. Cheriet. Unified framework for SVM model selection. In I. Guyon, et al., editors, Hands on Pattern Recognition. Microtome, 2009.
  • 2. H. Akaike. Information theory and an extension of the maximum likelihood principle. In B. N. Petrov and F. Csaki, editors, 2nd International Symposium on Information Theory, pages 267-281. Akademia Kiado, Budapest, 1973.
  • 4. C.-A. Azencott and P. Baldi. Virtual high-throughput screening with two-dimensional kernels. In I. Guyon, et al., editors, Hands on Pattern Recognition. Microtome, 2009.
  • 5. P. L. Bartlett. For valid generalization the size of the weights is more important than the size of the network. In M. C. Mozer, M. I. Jordan, and T. Petsche, editors, Advances in Neural Information Processing Systems, volume 9, page 134, Cambridge, MA, 1997. MIT Press.
  • 7. A. Blum and P. Langley. Selection of relevant features and examples in machine learning. Artificial Intelligence, 97(1-2):245-271, December 1997.
  • 8. B. Boser, I. Guyon, and V. Vapnik. A training algorithm for optimal margin classifiers. In COLT, pages 144-152, 1992.
  • 9. M. Boullé. Compression-based averaging of selective naive Bayes classifiers. In I. Guyon and A. Saffari, editors, JMLR, Special Topic on Model Selection, volume 8, pages 1659-1685, July 2007. URL http://www.jmlr.org/papers/volume8/boulle07a/boulle07a.pdf.
  • 10. M. Boullé. Data grid models for preparation and modeling in supervised learning. In I. Guyon, et al., editors, Hands on Pattern Recognition. Microtome, 2009.
  • 11. L. Breiman. Random forests. Machine Learning, 45(1):5-32, 2001.
  • 12. L. Breiman. Bagging predictors. Machine Learning, 24(2):123-140, 1996.
  • 13. G. Cawley. Leave-one-out cross-validation based model selection criteria for weighted LS-SVMs. In IJCNN, pages 1661-1668, 2006.
  • 14. G. Cawley and N. Talbot. Over-fitting in model selection and subsequent selection bias in performance evaluation. JMLR, submitted, 2009.
  • 15. G. Cawley and N. Talbot. Preventing over-fitting during model selection via Bayesian regularization of the hyper-parameters. In I. Guyon and A. Saffari, editors, JMLR, Special Topic on Model Selection, volume 8, pages 841-861, April 2007a. URL http://www.jmlr.org/papers/volume8/cawley07a/cawley07a.pdf.
  • 16. G. C. Cawley and N. L. C. Talbot. Agnostic learning versus prior knowledge in the design of kernel machines. In Proc. IJCNN07, Orlando, Florida, August 2007b. INNS/IEEE.
  • 19. G. Claeskens, C. Croux, and J. Van Kerckhoven. An information criterion for variable selection in Support Vector Machines. In I. Guyon and A. Saffari, editors, JMLR, Special Topic on Model Selection, volume 9, pages 541-558, March 2008. URL http://www.jmlr.org/papers/volume9/claeskens08a/claeskens08a.pdf.
  • 21. C. Dahinden. An improved Random Forests approach with application to the performance prediction challenge datasets. In I. Guyon, et al., editors, Hands on Pattern Recognition. Microtome, 2009.
  • 22. M. Debruyne, M. Hubert, and J. Suykens. Model selection in kernel based regression using the influence function. In I. Guyon and A. Saffari, editors, JMLR, Special Topic on Model Selection, volume 9, pages 2377-2400, October 2008. URL http://www.jmlr.org/papers/volume9/debruyne08a/debruyne08a.pdf.
  • 24. J. Friedman. Greedy function approximation: A gradient boosting machine. Annals of Statistics, 29:1189-1232, 2000.
  • 25. J. Friedman, T. Hastie, and R. Tibshirani. Additive logistic regression: a statistical view of boosting. Annals of Statistics, 28:337-374, 2000.
  • 28. Y. Guermeur. VC theory of large margin multi-category classifiers. In I. Guyon and A. Saffari, editors, JMLR, Special Topic on Model Selection, volume 8, pages 2551-2594, November 2007. URL http://www.jmlr.org/papers/volume8/guermeur07a/guermeur07a.pdf.
  • 29. I. Guyon. A practical guide to model selection. In J. Marie, editor, Machine Learning Summer School. Springer, to appear, 2009.
  • 30. I. Guyon, S. Gunn, M. Nikravesh, and L. Zadeh, editors. Feature Extraction, Foundations and Applications. Studies in Fuzziness and Soft Computing. With data, results and sample code for the NIPS 2003 feature selection challenge. Physica-Verlag, Springer, 2006a. URL http://clopinet.com/fextract-book/.
  • 33. I. Guyon, C. Aliferis, G. Cooper, A. Elisseeff, J.-P. Pellet, P. Spirtes, and A. Statnikov. Design and analysis of the causation and prediction challenge. In JMLR W&CP, volume 3, pages 1-33, WCCI2008 workshop on causality, Hong Kong, June 3-4, 2008a. URL http://jmlr.csail.mit.edu/papers/topic/causality.html.
  • 34. I. Guyon, A. Saffari, G. Dror, and G. Cawley. Analysis of the IJCNN 2007 agnostic learning vs. prior knowledge challenge. In Neural Networks, volume 21, pages 544-550, Orlando, Florida, March 2008b.
  • 36. I. Guyon, V. Lemaire, M. Boullé, G. Dror, and D. Vogel. Analysis of the KDD cup 2009: Fast scoring on a large orange customer database. In KDD Cup 2009, in press, volume 8. JMLR W&CP, 2009b.
  • 37. H. J. Escalante, M. Montes, and L. E. Sucar. Particle swarm model selection. In I. Guyon and A. Saffari, editors, JMLR, Special Topic on Model Selection, volume 10, pages 405-440, February 2009. URL http://www.jmlr.org/papers/volume10/escalante09a/escalante09a.pdf.
  • 39. T. Hastie, S. Rosset, R. Tibshirani, and J. Zhu. The entire regularization path for the support vector machine. JMLR, 5:1391-1415, 2004. URL http://jmlr.csail.mit.edu/papers/volume5/hastie04a/hastie04a.pdf.
  • 40. D. Haussler, M. Kearns, and R. Schapire. Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension. Machine Learning, 14(1):83-113, 1994. ISSN 0885-6125.
  • 41. A. E. Hoerl. Application of ridge analysis to regression problems. Chemical Engineering Progress, 58:54-59, 1962.
  • 42. C. Hue and M. Boullé. A new probabilistic approach in rank regression with optimal Bayesian partitioning. In I. Guyon and A. Saffari, editors, JMLR, Special Topic on Model Selection, volume 8, pages 2727-2754, December 2007. URL http://www.jmlr.org/papers/volume8/hue07a/hue07a.pdf.
  • 43. IBM team. Winning the KDD cup orange challenge with ensemble selection. In KDD Cup 2009, in press, volume 8. JMLR W&CP, 2009.
  • 44. R. Kohavi and G. John. Wrappers for feature selection. Artificial Intelligence, 97(1-2):273-324, December 1997.
  • 45. I. Koo and R. M. Kil. Model selection for regression with continuous kernel functions using the modulus of continuity. In I. Guyon and A. Saffari, editors, JMLR, Special Topic on Model Selection, volume 9, pages 2607-2633, November 2008. URL http://www.jmlr.org/papers/volume9/koo08b/koo08b.pdf.
  • 46. G. Kunapuli, J.-S. Pang, and K. Bennett. Bilevel cross-validation-based model selection. In I. Guyon, et al., editors, Hands on Pattern Recognition. Microtome, 2009.
  • 47. J. Langford. Tutorial on practical prediction theory for classification. JMLR, 6:273-306, March 2005. URL http://jmlr.csail.mit.edu/papers/volume6/langford05a/langford05a.pdf.
  • 49. R. W. Lutz. Logitboost with trees applied to the WCCI 2006 performance prediction challenge datasets. In Proc. IJCNN06, pages 2966-2969, Vancouver, Canada, July 2006. INNS/IEEE.
  • 50. D. MacKay. A practical Bayesian framework for backpropagation networks. Neural Computation, 4:448-472, 1992.
  • 51. O. Madani, D. M. Pennock, and G. W. Flake. Co-validation: Using model disagreement to validate classification algorithms. In NIPS, 2005.
  • 53. R. Neal and J. Zhang. High dimensional classification with Bayesian neural networks and Dirichlet diffusion trees. In I. Guyon, et al., editors, Feature Extraction, Foundations and Applications, 2006.
  • 54. V. Nikulin. Classification with random sets, boosting and distance-based clustering. In I. Guyon, et al., editors, Hands on Pattern Recognition. Microtome, 2009.
  • 55. T. Poggio and F. Girosi. Regularization algorithms for learning that are equivalent to multilayer networks. Science, 247(4945):978-982, February 1990.
  • 56. A. Pozdnoukhov and S. Bengio. Invariances in kernel methods: From samples to objects. Pattern Recognition Letters, 27(10):1087-1097, 2006. ISSN 0167-8655.
  • 57. E. Pranckeviciene and R. Somorjai. Liknon feature selection: Behind the scenes. In I. Guyon, et al., editors, Hands on Pattern Recognition. Microtome, 2009.
  • 58. J. Reunanen. Model selection and assessment using cross-indexing. In Proc. IJCNN07, Orlando, Florida, August 2007. INNS/IEEE.
  • 59. S. Rosset and J. Zhu. Sparse, flexible and efficient modeling using L1 regularization. In I. Guyon, et al., editors, Feature Extraction, Foundations and Applications, 2006.
  • 60. S. Rosset, J. Zhu, and T. Hastie. Boosting as a regularized path to a maximum margin classifier. Journal of Machine Learning Research, 5:941-973, 2004.
  • 61. M. Saeed. Hybrid learning using mixture models and artificial neural networks. In I. Guyon, et al., editors, Hands on Pattern Recognition. Microtome, 2009.
  • 62. A. Saffari and I. Guyon. Quick start guide for CLOP. Technical report, Graz University of Technology and Clopinet, May 2006. URL http://clopinet.com/CLOP/.
  • 64. G. Schwarz. Estimating the dimension of a model. The Annals of Statistics, 6(2):461-464, 1978.
  • 65. M. Seeger. PAC-Bayesian generalisation error bounds for Gaussian process classification. JMLR, 3:233-269, 2003. URL http://jmlr.csail.mit.edu/papers/volume3/seeger02a/seeger02a.pdf.
  • 66. M. Seeger. Bayesian inference and optimal design for the sparse linear model. JMLR, 9:759-813, 2008. ISSN 1533-7928.
  • 67. P. Simard, Y. LeCun, and J. Denker. Efficient pattern recognition using a new transformation distance. In S. J. Hanson, J. D. Cowan, and C. L. Giles, editors, Advances in Neural Information Processing Systems 5, pages 50-58, San Mateo, CA, 1993. Morgan Kaufmann.
  • 68. A. Singh, R. Nowak, and X. Zhu. Unlabeled data: Now it helps, now it doesn't. In NIPS, 2008.
  • 69. A. Smola, S. Mika, B. Schölkopf, and R. Williamson. Regularized principal manifolds. JMLR, 1:179-209, 2001. URL http://jmlr.csail.mit.edu/papers/volume1/smola01a/smola01a.pdf.
  • 71. E. Tuv, A. Borisov, G. Runger, and K. Torkkola. Feature selection with ensembles, artificial variables, and redundancy elimination. In I. Guyon and A. Saffari, editors, JMLR, Special Topic on Model Selection, volume 10, pages 1341-1366, July 2009. URL http://www.jmlr.org/papers/volume10/tuv09a/tuv09a.pdf.
  • 72. L. Valiant. A theory of the learnable. Communications of the ACM, 27(11):1134-1142, 1984.
  • 75. V. Vapnik and A. Chervonenkis. On the uniform convergence of relative frequencies of events to their probabilities. Theory Probab. Appl., 16:264-280, 1971.
  • 76. S. Vishwanathan and A. Smola. Fast kernels for string and tree matching. In Advances in Neural Information Processing Systems 15, pages 569-576. MIT Press, 2003. URL http://books.nips.cc/papers/files/nips15/AA11.pdf.
  • 78. A. Waibel. Consonant recognition by modular construction of large phonemic time-delay neural networks. In NIPS, pages 215-223, 1988.
  • 79. C. Watkins. Dynamic alignment kernels. In A. J. Smola, P. L. Bartlett, B. Schölkopf, and D. Schuurmans, editors, Advances in Large Margin Classifiers, pages 39-50, Cambridge, MA, 2000. MIT Press. URL http://www.cs.rhul.ac.uk/home/chrisw/dynk.ps.gz.
  • 81. J. Weston, A. Elisseeff, B. Schölkopf, and M. Tipping. Use of the zero norm with linear models and kernel methods. JMLR, 3:1439-1461, 2003.
  • 82. J. Wichard. Agnostic learning with ensembles of classifiers. In Proc. IJCNN07, Orlando, Florida, August 2007. INNS/IEEE.
  • 83. J. Ye, S. Ji, and J. Chen. Multi-class discriminant kernel learning via convex programming. In I. Guyon and A. Saffari, editors, JMLR, Special Topic on Model Selection, volume 9, pages 719-758, April 2008. URL http://www.jmlr.org/papers/volume9/ye08b/ye08b.pdf.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.