Volume 2015-January, 2015, Pages 1657-1665

Less is more: Nyström computational regularization

Author keywords

[No Author keywords available]

Indexed keywords

INFORMATION SCIENCE;

EID: 84965167586     PISSN: 1049-5258     EISSN: None     Source Type: Conference Proceeding
DOI: None     Document Type: Conference Paper
Times cited : (316)

References (34)
  • 2. Alex J. Smola and Bernhard Schölkopf. Sparse Greedy Matrix Approximation for Machine Learning. In ICML, pages 911-918. Morgan Kaufmann, 2000.
  • 3. C. Williams and M. Seeger. Using the Nyström Method to Speed Up Kernel Machines. In NIPS, pages 682-688. MIT Press, 2000.
  • 4. Ali Rahimi and Benjamin Recht. Random Features for Large-Scale Kernel Machines. In NIPS, pages 1177-1184. Curran Associates, Inc., 2007.
  • 6. Quoc V. Le, Tamás Sarlós, and Alexander J. Smola. Fastfood - Computing Hilbert Space Expansions in Loglinear Time. In ICML, volume 28 of JMLR Proceedings, pages 244-252. JMLR.org, 2013.
  • 9. S. Kumar, M. Mohri, and A. Talwalkar. Ensemble Nystrom Method. In NIPS, pages 1060-1068, 2009.
  • 10. Mu Li, James T. Kwok, and Bao-Liang Lu. Making Large-Scale Nyström Approximation Possible. In ICML, pages 631-638. Omnipress, 2010.
  • 11. Kai Zhang, Ivor W. Tsang, and James T. Kwok. Improved Nyström Low-Rank Approximation and Error Analysis. In ICML, pages 1232-1239. ACM, 2008.
  • 12. Bo Dai, Bo Xie, Niao He, Yingyu Liang, Anant Raj, Maria-Florina Balcan, and Le Song. Scalable Kernel Methods via Doubly Stochastic Gradients. In NIPS, pages 3041-3049, 2014.
  • 13. Petros Drineas and Michael W. Mahoney. On the Nyström Method for Approximating a Gram Matrix for Improved Kernel-Based Learning. JMLR, 6:2153-2175, December 2005.
  • 15. Shusen Wang and Zhihua Zhang. Improving CUR Matrix Decomposition and the Nyström Approximation via Adaptive Sampling. JMLR, 14(1):2729-2769, 2013.
  • 16. Petros Drineas, Malik Magdon-Ismail, Michael W. Mahoney, and David P. Woodruff. Fast Approximation of Matrix Coherence and Statistical Leverage. JMLR, 13:3475-3506, 2012.
  • 17. Michael B. Cohen, Yin Tat Lee, Cameron Musco, Christopher Musco, Richard Peng, and Aaron Sidford. Uniform Sampling for Matrix Approximation. In ITCS, pages 181-190. ACM, 2015.
  • 18. Shusen Wang and Zhihua Zhang. Efficient Algorithms and Error Analysis for the Modified Nystrom Method. In AISTATS, volume 33 of JMLR Proceedings, pages 996-1004. JMLR.org, 2014.
  • 19. S. Kumar, M. Mohri, and A. Talwalkar. Sampling Methods for the Nyström Method. JMLR, 13(1):981-1006, 2012.
  • 20. Corinna Cortes, Mehryar Mohri, and Ameet Talwalkar. On the Impact of Kernel Approximation on Learning Accuracy. In AISTATS, volume 9 of JMLR Proceedings, pages 113-120. JMLR.org, 2010.
  • 21. R. Jin, T. Yang, M. Mahdavi, Y. Li, and Z. Zhou. Improved Bounds for the Nyström Method with Application to Kernel Classification. IEEE Transactions on Information Theory, 59(10), October 2013.
  • 22. Tianbao Yang, Yu-Feng Li, Mehrdad Mahdavi, Rong Jin, and Zhi-Hua Zhou. Nyström Method vs Random Fourier Features: A Theoretical and Empirical Comparison. In NIPS, pages 485-493, 2012.
  • 23. Francis Bach. Sharp Analysis of Low-Rank Kernel Matrix Approximations. In COLT, volume 30, 2013.
  • 26. Andrea Caponnetto and Ernesto De Vito. Optimal Rates for the Regularized Least-Squares Algorithm. Foundations of Computational Mathematics, 7(3):331-368, 2007.
  • 28. I. Steinwart, D. Hush, and C. Scovel. Optimal Rates for Regularized Least Squares Regression. In COLT, 2009.
  • 30. F. Bauer, S. Pereverzev, and L. Rosasco. On Regularization Algorithms in Learning Theory. Journal of Complexity, 23(1):52-72, 2007.
  • 31. A. Caponnetto and Yuan Yao. Adaptive Rates for Regularization Operators in Learning Theory. Analysis and Applications, 8, 2010.
  • 33. Alessandro Rudi, Guillermo D. Canas, and Lorenzo Rosasco. On the Sample Complexity of Subspace Learning. In NIPS, pages 2067-2075, 2013.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.