Journal of Machine Learning Research, Volume 6, 2005, Pages 2153-2175

On the Nyström method for approximating a Gram matrix for improved kernel-based learning

Author keywords

Gram matrix; Kernel methods; Nyström method; Randomized algorithms

Indexed keywords

ALGORITHMS; COMPUTATION THEORY; INTEGRAL EQUATIONS; LEARNING SYSTEMS; MATRIX ALGEBRA; PROBABILITY DISTRIBUTIONS;

EID: 29244453931     PISSN: 15337928     EISSN: 15337928     Source Type: Journal    
DOI: None     Document Type: Article
Times cited: 940
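For context, the paper indexed above studies a randomized Nyström-type low-rank approximation of an n × n Gram matrix G: sample c columns of G, rescale them, and recombine them through the pseudoinverse of the sampled c × c intersection block. The sketch below is a minimal illustration of that general construction under stated assumptions, not a reproduction of the paper's exact algorithm or error analysis; the function name, the linear kernel, and the sampling probabilities proportional to the squared diagonal entries are choices made for this example.

```python
import numpy as np

def nystrom_gram_approximation(X, c, k, seed=None):
    """Minimal sketch of a Nystrom-type rank-k approximation of the Gram
    matrix G = X X^T (linear kernel, chosen only to keep the example small).

    Samples c columns of G with probabilities proportional to G_ii^2,
    rescales them, and returns G_tilde = C pinv(W_k) C^T, where C holds the
    sampled columns and W is their c x c intersection block.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    diag_G = np.einsum("ij,ij->i", X, X)          # G_ii without forming all of G
    p = diag_G**2 / np.sum(diag_G**2)             # column-sampling probabilities (assumed)
    idx = rng.choice(n, size=c, replace=True, p=p)
    scale = 1.0 / np.sqrt(c * p[idx])             # rescaling of the sampled columns
    C = (X @ X[idx].T) * scale                    # n x c sampled, rescaled columns of G
    W = scale[:, None] * (X[idx] @ X[idx].T) * scale[None, :]   # c x c intersection block
    # Rank-k pseudoinverse of W via its eigendecomposition (W is PSD).
    evals, evecs = np.linalg.eigh(W)
    top = np.argsort(evals)[::-1][:k]
    ev_k, V_k = evals[top], evecs[:, top]
    inv_ev_k = np.where(ev_k > 1e-12, 1.0 / np.maximum(ev_k, 1e-12), 0.0)
    W_k_pinv = (V_k * inv_ev_k) @ V_k.T
    return C @ W_k_pinv @ C.T                     # n x n approximation of G

# Example usage on synthetic data:
# X = np.random.default_rng(0).standard_normal((500, 20))
# G_tilde = nystrom_gram_approximation(X, c=50, k=10, seed=0)
```

Because only c ≪ n columns are sampled, the expensive work is the c × c eigendecomposition rather than an n × n one, which is the efficiency gain the record's keywords (kernel methods, randomized algorithms) point to.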

References (42)
  • 4. M. Belkin and P. Niyogi. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6):1373-1396, 2003.
  • 7. R. Bhatia. Matrix Analysis. Springer-Verlag, New York, 1997.
  • 11. D. L. Donoho and C. Grimes. Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data. Proc. Natl. Acad. Sci. USA, 100(10):5591-5596, 2003.
  • 15. P. Drineas, R. Kannan, and M. W. Mahoney. Fast Monte Carlo algorithms for matrices I: Approximating matrix multiplication. Technical Report YALEU/DCS/TR-1269, Yale University Department of Computer Science, New Haven, CT, February 2004a. Accepted for publication in the SIAM Journal on Computing.
  • 16. P. Drineas, R. Kannan, and M. W. Mahoney. Fast Monte Carlo algorithms for matrices II: Computing a low-rank approximation to a matrix. Technical Report YALEU/DCS/TR-1270, Yale University Department of Computer Science, New Haven, CT, February 2004b. Accepted for publication in the SIAM Journal on Computing.
  • 17. P. Drineas, R. Kannan, and M. W. Mahoney. Fast Monte Carlo algorithms for matrices III: Computing a compressed approximate matrix decomposition. Technical Report YALEU/DCS/TR-1271, Yale University Department of Computer Science, New Haven, CT, February 2004c. Accepted for publication in the SIAM Journal on Computing.
  • 18. P. Drineas, R. Kannan, and M. W. Mahoney. Sampling sub-problems of heterogeneous Max-Cut problems and approximation algorithms. Technical Report YALEU/DCS/TR-1283, Yale University Department of Computer Science, New Haven, CT, April 2004d.
  • 21. P. Drineas and M. W. Mahoney. On the Nyström method for approximating a Gram matrix for improved kernel-based learning. Technical Report YALEU/DCS/TR-1319, Yale University Department of Computer Science, New Haven, CT, April 2005b.
  • 22. S. Fine and K. Scheinberg. Efficient SVM training using low-rank kernel representations. Journal of Machine Learning Research, 2:243-264, 2001.
  • 26. S. A. Goreinov and E. E. Tyrtyshnikov. The maximum-volume concept in approximation by low-rank matrices. Contemporary Mathematics, 280:47-51, 2001.
  • 28. J. Ham, D. D. Lee, S. Mika, and B. Schölkopf. A kernel view of the dimensionality reduction of manifolds. Technical Report TR-110, Max Planck Institute for Biological Cybernetics, July 2003.
  • 33. L. Rademacher, S. Vempala, and G. Wang. Matrix approximation and projective clustering via iterative sampling. Technical Report MIT-LCS-TR-983, Massachusetts Institute of Technology, Cambridge, MA, March 2005.
  • 34. S. T. Roweis and L. K. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290:2323-2326, 2000.
  • 35. B. Schölkopf, A. Smola, and K.-R. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10:1299-1319, 1998.
  • 38. J. B. Tenenbaum, V. de Silva, and J. C. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290:2319-2323, 2000.
  • 40. C. K. I. Williams, C. E. Rasmussen, A. Schwaighofer, and V. Tresp. Observations on the Nyström method for Gaussian process prediction. Technical report, University of Edinburgh, 2002.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.