Volume, Issue, 2011, Pages

k-NN regression adapts to local intrinsic dimension

Author keywords

[No Author keywords available]

Indexed keywords

CLUSTERING ALGORITHMS; SET THEORY;

EID: 85162400498     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 126

References (17)
  • 1. C. J. Stone. Optimal rates of convergence for non-parametric estimators. Ann. Statist., 8:1348-1360, 1980.
  • 2. C. J. Stone. Optimal global rates of convergence for non-parametric estimators. Ann. Statist., 10:1340-1353, 1982.
  • 3. S. Roweis and L. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290, 2000.
  • 4. J. Tenenbaum, V. de Silva, and J. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290, 2000.
  • 5. M. Belkin and P. Niyogi. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6):1373-1396, 2003.
  • 7. S. Kpotufe. Escaping the curse of dimensionality with a tree-based regressor. Conference on Learning Theory, 2009.
  • 9. S. Kulkarni and S. Posner. Rates of convergence of nearest neighbor estimation under arbitrary sampling. IEEE Transactions on Information Theory, 41, 1995.
  • 14. V. Vapnik and A. Chervonenkis. On the uniform convergence of relative frequencies of events to their expectation. Theory of Probability and Its Applications, 16:264-280, 1971.
  • 17. R. Cao-Abad. Rate of convergence for the wild bootstrap in nonparametric regression. Annals of Statistics, 19:2226-2231, 1991.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.