Volume 2, 2008, Pages 741-773

Classification with minimax fast rates for classes of Bayes rules with sparse representation

Author keywords

Aggregation; Classification; Decision dyadic trees; Minimax rates; Sparsity

EID: 80053459148     PISSN: 1935-7524     EISSN: None     Source Type: Journal
DOI: 10.1214/07-EJS015     Document Type: Article
Times cited: 12

References (26)
  • 2. N. Aronszajn. Theory of reproducing kernels. Trans. Am. Math. Soc., 68:337-404, 1950.
  • 3. P. Assouad. Deux remarques sur l’estimation. C. R. Acad. Sci. Paris Sér. I Math., 296(23):1021-1024, 1983. In French.
  • 11. G. Lecué. Optimal oracle inequality for aggregation of classifiers under low noise condition. In Proceedings of the 19th Annual Conference on Learning Theory (COLT 2006), pages 364-378, 2006.
  • 12. G. Lecué. Optimal rates of aggregation in classification under low noise assumption. Bernoulli, 13(4):1000-1022, 2007.
  • 13. G. Lugosi and N. Vayatis. On the Bayes-risk consistency of regularized boosting methods. Ann. Statist., 32(1):30-55, 2004.
  • 14. E. Mammen and A.B. Tsybakov. Smooth discrimination analysis. Ann. Statist., 27:1808-1829, 1999.
  • 15. P. Massart and E. Nédélec. Risk bounds for statistical learning. Ann. Statist., 34(5), 2006.
  • 17. S. Murthy. Automatic construction of decision trees from data: A multi-disciplinary survey. Data Mining and Knowledge Discovery, 2(4):345-389, 1998.
  • 19. C. Scott and R. Nowak. Minimax-optimal classification with dyadic decision trees. IEEE Transactions on Information Theory, 52(4):1335-1353, 2006.
  • 21. I. Steinwart and C. Scovel. Fast rates for support vector machines using Gaussian kernels. Ann. Statist., 35(2), 2007.
  • 23. A.B. Tsybakov. Optimal aggregation of classifiers in statistical learning. Ann. Statist., 32(1):135-166, 2004.
  • 24. A.B. Tsybakov and S.A. van de Geer. Square root penalty: Adaptation to the margin in classification and in edge estimation. Ann. Statist., 33:1203-1224, 2005.
  • 25. Y. Yang. Minimax nonparametric classification, part I: Rates of convergence. IEEE Transactions on Information Theory, 45:2271-2284, 1999.
  • 26. Y. Yang. Minimax nonparametric classification, part II: Model selection for adaptation. IEEE Transactions on Information Theory, 45:2285-2292, 1999.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.