Volume, Issue, 2010, Pages 74-84

Proximity-based graph embeddings for multi-label classification

Author keywords

Adjacency graph; Dimensionality reduction; Embedding; Multi-label classification; Supervised

Indexed keywords

ADJACENCY GRAPHS; DIMENSIONALITY REDUCTION; EMBEDDING; MULTI-LABEL CLASSIFICATIONS; SUPERVISED;

EID: 78651427272     PISSN: None     EISSN: None     Source Type: Conference Proceeding
DOI: None     Document Type: Conference Paper
Times cited: 2

References (45)
  • 1
    • Barutcuoglu, Z., Schapire, R. E., and Troyanskaya, O. G. (2006). Hierarchical multi-label prediction of gene function. Bioinformatics, 22(7): 830-836.
  • 3
    • Belkin, M. and Niyogi, P. (2003). Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6): 1373-1396. DOI 10.1162/089976603321780317
  • 7
    • Cai, D., He, X., and Han, J. (2007a). Spectral regression: A unified subspace learning framework for content-based image retrieval. In Proc. of the ACM Conference on Multimedia.
  • 13
    • Dhillon, I. S., Mallela, S., and Kumar, R. (2003). A divisive information-theoretic feature clustering algorithm for text classification. Journal of Machine Learning Research, 3: 1265-1287.
  • 14
    • Fisher, R. A. (1936). The use of multiple measurements in taxonomic problems. Annals of Eugenics, 7(2): 179-188.
  • 15
    • Gonzalez, T. F. (1985). Clustering to minimize the maximum intercluster distance. Theoretical Computer Science, 38: 293-306.
  • 16
    • Hardoon, D. R., Szedmak, S., and Shawe-Taylor, J. (2004). Canonical correlation analysis: An overview with application to learning methods. Neural Computation, 16(12): 2639-2664.
  • 17
    • He, X. (2004). Incremental semi-supervised subspace learning for image retrieval. In Proc. of the ACM Conference on Multimedia.
  • 21
    • Huang, Y., Chiang, C., Shieh, J., and Grimson, W. (2002). Prototype optimization for nearest-neighbor classification. Pattern Recognition, 35(6): 1237-1245.
  • 23
    • Kim, H., Howland, P., and Park, H. (2005). Dimension reduction in text classification with support vector machines. Journal of Machine Learning Research, 6: 37-53.
  • 24
    • Kokiopoulou, E. and Saad, Y. (2007). Orthogonal neighborhood preserving projections: A projection-based dimensionality reduction technique. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(12): 2143-2156. DOI 10.1109/TPAMI.2007.1131
  • 25
    • Kokiopoulou, E. and Saad, Y. (2009). Enhanced graph-based dimensionality reduction with repulsion Laplaceans. Pattern Recognition, 42: 2392-2402.
  • 26
    • Lewis, D. D. (1992). Feature selection and feature extraction for text categorization. In Proc. of the Workshop on Speech and Natural Language, pages 212-217, Harriman, New York.
  • 28
    • von Luxburg, U. (2007). A tutorial on spectral clustering. Statistics and Computing, 17(4).
  • 29
    • Mollineda, R., Ferri, F., and Vidal, E. (2002). An efficient prototype merging strategy for the condensed 1-NN rule through class-conditional hierarchical clustering. Pattern Recognition, 35(12): 2771-2782.
  • 30
    • Pekalska, E. and Duin, R. P. W. (2002). Dissimilarity representations allow for building good classifiers. Pattern Recognition Letters, 23(8): 943-956. DOI 10.1016/S0167-8655(02)00024-7
  • 31
    • Pekalska, E., Duin, R., and Paclik, P. (2006). Prototype selection for dissimilarity-based classifiers. Pattern Recognition, 39(2): 189-208.
  • 32
    • Roweis, S. T. and Saul, L. K. (2000). Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500): 2323-2326.
  • 34
    • Steinwart, I. (2001). On the influence of the kernel on the consistency of support vector machines. Journal of Machine Learning Research, 2: 67-93.
  • 35
    • Sugiyama, M. (2007). Dimensionality reduction of multimodal labeled data by local Fisher discriminant analysis. Journal of Machine Learning Research, 8: 1027-1061.
  • 36
    • Sugiyama, M. (2010). Semi-supervised local Fisher discriminant analysis for dimensionality reduction. Machine Learning, 78(1-2): 35-61.
  • 43
    • Zhang, W., Xue, X., Sun, Z., Guo, Y., and Lu, H. (2007). Optimal dimensionality of metric space for classification. In Proc. of the 24th International Conference on Machine Learning (ICML), pages 1135-1142. DOI 10.1145/1273496.1273639
  • 45
    • Zhang, Y. and Zhou, Z. (2007). Multi-label dimensionality reduction via dependence maximization. In Proc. of the 23rd National Conf. on Artificial Intelligence, volume 3, pages 1503-1505, Chicago, Illinois.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.