Volume 3559 LNAI, 2005, Pages 323-337

Approximating a Gram matrix for improved kernel-based learning

Author keywords

[No Author keywords available]

Indexed keywords

APPROXIMATION THEORY; LEARNING ALGORITHMS; LEARNING SYSTEMS; PROBABILITY DISTRIBUTIONS; RANDOM PROCESSES;

EID: 26944440870     PISSN: 0302-9743     EISSN: 1611-3349     Source Type: Book Series
DOI: 10.1007/11503415_22     Document Type: Conference Paper
Times cited: 32

References (33)
  • 4. M. Belkin and P. Niyogi. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6):1373-1396, 2003.
  • 9. D.L. Donoho and C. Grimes. Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data. Proc. Natl. Acad. Sci. USA, 100(10):5591-5596, 2003.
  • 13. P. Drineas, R. Kannan, and M.W. Mahoney. Fast Monte Carlo algorithms for matrices I: Approximating matrix multiplication. Technical Report YALEU/DCS/TR-1269, Yale University Department of Computer Science, New Haven, CT, February 2004.
  • 14. P. Drineas, R. Kannan, and M.W. Mahoney. Fast Monte Carlo algorithms for matrices II: Computing a low-rank approximation to a matrix. Technical Report YALEU/DCS/TR-1270, Yale University Department of Computer Science, New Haven, CT, February 2004.
  • 15. P. Drineas, R. Kannan, and M.W. Mahoney. Fast Monte Carlo algorithms for matrices III: Computing a compressed approximate matrix decomposition. Technical Report YALEU/DCS/TR-1271, Yale University Department of Computer Science, New Haven, CT, February 2004.
  • 16. P. Drineas, R. Kannan, and M.W. Mahoney. Sampling sub-problems of heterogeneous Max-Cut problems and approximation algorithms. Technical Report YALEU/DCS/TR-1283, Yale University Department of Computer Science, New Haven, CT, April 2004.
  • 18. P. Drineas and M.W. Mahoney. On the Nyström method for approximating a Gram matrix for improved kernel-based learning. Technical Report 1319, Yale University Department of Computer Science, New Haven, CT, April 2005.
  • 19. S. Fine and K. Scheinberg. Efficient SVM training using low-rank kernel representations. Journal of Machine Learning Research, 2:243-264, 2001.
  • 22. J. Ham, D.D. Lee, S. Mika, and B. Schölkopf. A kernel view of the dimensionality reduction of manifolds. Technical Report TR-110, Max Planck Institute for Biological Cybernetics, July 2003.
  • 26. S.T. Roweis and L.K. Saul. Nonlinear dimensionality reduction by local linear embedding. Science, 290:2323-2326, 2000.
  • 27. B. Schölkopf, A. Smola, and K.-R. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10:1299-1319, 1998.
  • 29. J.B. Tenenbaum, V. de Silva, and J.C. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290:2319-2323, 2000.
  • 31. C.K.I. Williams, C.E. Rasmussen, A. Schwaighofer, and V. Tresp. Observations on the Nyström method for Gaussian process prediction. Technical report, University of Edinburgh, 2002.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.