Advances in Computational Mathematics, Volume 32, Issue 2, 2010, Pages 175-189

Regularized least square regression with dependent samples

Author keywords

Capacity independent error bounds; Integral operator; Regularized least square regression; Strong mixing condition

EID: 77951769307 | PISSN: 1019-7168 | EISSN: None | Source Type: Journal
DOI: 10.1007/s10444-008-9099-y | Document Type: Article
Times cited: 60

References (19)
  • 1. Aronszajn, N.: Theory of reproducing kernels. Trans. Amer. Math. Soc. 68, 337-404 (1950).
  • 2. Athreya, K. B., Pantula, S. G.: Mixing properties of Harris chains and autoregressive processes. J. Appl. Probab. 23, 880-892 (1986).
  • 3. Bartlett, P. L., Mendelson, S.: Rademacher and Gaussian complexities: risk bounds and structural results. J. Mach. Learn. Res. 3, 463-482 (2002).
  • 7. Davydov, Y. A.: The invariance principle for stationary processes. Theory Probab. Appl. 14, 487-498 (1970).
  • 8. Dehling, H., Philipp, W.: Almost sure invariance principles for weakly dependent vector-valued random variables. Ann. Probab. 10, 689-701 (1982).
  • 9. Evgeniou, T., Pontil, M., Poggio, T.: Regularization networks and support vector machines. Adv. Comput. Math. 13, 1-50 (2000).
  • 10. Li, L. Q., Wan, C. G.: Support vector machines with beta-mixing input sequences. In: Wang, J., et al. (eds.) Lecture Notes in Computer Science, vol. 3971, pp. 928-935. Springer, New York (2006).
  • 11. Modha, D. S.: Minimum complexity regression estimation with weakly dependent observations. IEEE Trans. Inform. Theory 42, 2133-2145 (1996).
  • 12. Smale, S., Zhou, D. X.: Shannon sampling and function reconstruction from point values. Bull. Amer. Math. Soc. 41, 279-305 (2004).
  • 13. Smale, S., Zhou, D. X.: Shannon sampling II: connections to learning theory. Appl. Comput. Harmon. Anal. 19, 285-302 (2005).
  • 14. Smale, S., Zhou, D. X.: Learning theory estimates via integral operators and their approximations. Constr. Approx. 26, 153-172 (2007).
  • 16. White, H.: Connectionist nonparametric regression: multilayer feedforward networks can learn arbitrary mappings. Neural Netw. 3, 535-549 (1990).
  • 17. Wu, Q., Ying, Y. M., Zhou, D. X.: Learning rates of least-square regularized regression. Found. Comput. Math. 6, 171-192 (2006).
  • 18. Xu, Y. L., Chen, D. R.: Learning rates of regularized regression for exponentially strongly mixing sequence. J. Statist. Plann. Inference 138(7), 2180-2189 (2008).
  • 19. Zhang, T.: Leave-one-out bounds for kernel methods. Neural Comput. 15, 1397-1437 (2003).


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS DB.