IEEE Transactions on Information Theory, Volume 51, Issue 11, 2005, Pages 3806-3819

A Neyman-Pearson approach to statistical learning

Author keywords

Generalization error bounds; Neyman-Pearson (NP) classification; Statistical learning theory

Indexed keywords

CONVERGENCE OF NUMERICAL METHODS; DECISION THEORY; ERROR ANALYSIS; PROBABILITY; STATISTICAL METHODS; THEOREM PROVING; TREES (MATHEMATICS);

EID: 27744553952     PISSN: 0018-9448     EISSN: None     Source Type: Journal
DOI: 10.1109/TIT.2005.856955     Document Type: Article
Times cited: 139

References (38)
  • 4. E. L. Lehmann, "The Fisher, Neyman-Pearson theories of testing hypotheses: One theory or two?," J. Amer. Statist. Assoc., vol. 88, no. 424, pp. 1242-1249, Dec. 1993.
  • 5. P. Sebastiani, E. Gussoni, I. S. Kohane, and M. Ramoni, "Statistical challenges in functional genomics," Statist. Sci., vol. 18, no. 1, pp. 33-40, 2003.
  • 6. S. Dudoit, J. Fridlyand, and T. P. Speed, "Comparison of discrimination methods for the classification of tumors using gene expression data," J. Amer. Statist. Assoc., vol. 97, no. 457, pp. 77-87, Mar. 2002.
  • 7. B. Zadrozny, J. Langford, and N. Abe, "Cost sensitive learning by cost-proportionate example weighting," in Proc. 3rd Int. Conf. Data Mining, Melbourne, FL, 2003.
  • 8. P. Domingos, "MetaCost: A general method for making classifiers cost sensitive," in Proc. 5th Int. Conf. Knowledge Discovery and Data Mining, San Diego, CA, 1999, pp. 155-164.
  • 10. D. Margineantu, "Class probability estimation and cost-sensitive classification decisions," in Proc. 13th European Conf. Machine Learning, Helsinki, Finland, 2002, pp. 270-281.
  • 11. D. Casasent and X.-W. Chen, "Radial basis function neural networks for nonlinear Fisher discrimination and Neyman-Pearson classification," Neural Netw., vol. 16, pp. 529-535, 2003.
  • 12. A. Cannon, J. Howse, D. Hush, and C. Scovel, Learning With the Neyman-Pearson and min-max Criteria, Los Alamos National Laboratory, 2002. [Online]. Available: http://www.c3.lanl.gov/kelly/ml/pubs/2002minmax/paper.pdf
  • 13. A. Cannon, J. Howse, D. Hush, and C. Scovel, Simple Classifiers, Los Alamos National Laboratory, 2003. [Online]. Available: http://www.c3.lanl.gov/ml/pubs/2003sclassifiers/abstract.shtml
  • 15. V. Vapnik and C. Chervonenkis, "On the uniform convergence of relative frequencies of events to their probabilities," Theory Probab. Its Applic., vol. 16, no. 2, pp. 264-280, 1971.
  • 17. V. Koltchinskii, "Rademacher penalties and structural risk minimization," IEEE Trans. Inf. Theory, vol. 47, no. 5, pp. 1902-1914, Jul. 2001.
  • 18. P. Bartlett, S. Boucheron, and G. Lugosi, "Model selection and error estimation," Mach. Learn., vol. 48, pp. 85-113, 2002.
  • 20. G. Lugosi and K. Zeger, "Concept learning using complexity regularization," IEEE Trans. Inf. Theory, vol. 42, no. 1, pp. 48-54, Jan. 1996.
  • 21. S. Boucheron, O. Bousquet, and G. Lugosi, Theory of Classification: A Survey of Recent Advances, 2004. [Online]. Available: http://www.econ.upf.es/lugosi/
  • 22. A. B. Tsybakov, "Optimal aggregation of classifiers in statistical learning," Ann. Statist., vol. 32, no. 1, pp. 135-166, 2004.
  • 23. C. Scott and R. Nowak, "On the adaptive properties of decision trees," in Advances in Neural Information Processing Systems 17, L. K. Saul, Y. Weiss, and L. Bottou, Eds. Cambridge, MA: MIT Press, 2005, pp. 1225-1232.
  • 24. I. Steinwart and C. Scovel, "Fast rates to Bayes for kernel machines," in Advances in Neural Information Processing Systems 17, L. K. Saul, Y. Weiss, and L. Bottou, Eds. Cambridge, MA: MIT Press, 2005, pp. 1337-1344.
  • 25. G. Blanchard, G. Lugosi, and N. Vayatis, "On the rate of convergence of regularized boosting classifiers," J. Mach. Learn. Res., vol. 4, pp. 861-894, 2003.
  • 26. A. B. Tsybakov and S. A. van de Geer, Square Root Penalty: Adaptation to the Margin in Classification and in Edge Estimation, 2004. [Online]. Available: http://www.proba.jussieu.fr/pageperso/tsybakov/tsybakov.html
  • 27. L. Devroye, "Any discrimination rule can have an arbitrarily bad probability of error for finite sample size," IEEE Trans. Pattern Anal. Mach. Intell., vol. PAMI-4, pp. 154-157, Mar. 1982.
  • 28. C. Scott and R. Nowak, Minimax Optimal Classification With Dyadic Decision Trees, Rice Univ., Houston, TX. [Online]. Available: http://www.stat.rice.edu/~cscott
  • 29. C. Scott and R. Nowak, "Dyadic classification trees via structural risk minimization," in Advances in Neural Information Processing Systems 15, S. Becker, S. Thrun, and K. Obermayer, Eds. Cambridge, MA, 2003.
  • 30. C. Scott and R. Nowak, "Near-minimax optimal classification with dyadic classification trees," in Advances in Neural Information Processing Systems 16, S. Thrun, L. Saul, and B. Schölkopf, Eds. Cambridge, MA: MIT Press, 2004.
  • 31. R. A. DeVore, "Nonlinear approximation," Acta Numer., vol. 7, pp. 51-150, 1998.
  • 33. D. Donoho, "Wedgelets: Nearly minimax estimation of edges," Ann. Statist., vol. 27, pp. 859-897, 1999.
  • 34. G. Blanchard, C. Schäfer, and Y. Rozenholc, "Oracle bounds and exact algorithm for dyadic classification trees," in Learning Theory: Proc. 17th Annu. Conf. Learning Theory, COLT 2004, J. Shawe-Taylor and Y. Singer, Eds. Heidelberg, Germany: Springer-Verlag, 2004, pp. 378-392.
  • 35. D. Donoho, "CART and best-ortho-basis selection: A connection," Ann. Statist., vol. 25, pp. 1870-1911, 1997.
  • 37. T. Hagerup and C. Rüb, "A guided tour of Chernoff bounds," Inf. Process. Lett., vol. 33, no. 6, pp. 305-308, 1990.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.