Volume , Issue , 2010, Pages 247-254

Unsupervised feature selection: Minimize information redundancy of features

Author keywords

Eigen decomposition; Feature; Gaussian elimination; PCA; Selection; Unsupervised

Indexed keywords

EIGEN DECOMPOSITION; FEATURE; GAUSSIAN-ELIMINATION; PCA; SELECTION; UNSUPERVISED;

EID: 79951750086     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1109/TAAI.2010.49     Document Type: Conference Paper
Times cited: 6
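
The record's keywords name eigen decomposition, Gaussian elimination, and PCA as the linear-algebra tools behind the redundancy-minimizing selection. The paper's own algorithm is not reproduced in this record; purely as an illustrative point of reference, the sketch below shows a generic PCA-loading-based feature selection scheme in the spirit of the principal-variables work cited in references 11, 12, 19, and 20. All function names and parameters are assumptions for illustration, not anything taken from the paper.

```python
# Minimal, generic sketch of PCA-loading-based unsupervised feature selection.
# NOTE: this is NOT the algorithm of the paper above (the record does not
# reproduce it); it only illustrates the kind of eigen-decomposition machinery
# the author keywords name.
import numpy as np


def select_features_by_pca_loadings(X, k):
    """Pick k feature indices whose loadings dominate the k leading principal components.

    X : (n_samples, n_features) data matrix
    k : number of features to keep
    """
    Xc = X - X.mean(axis=0)                     # center each feature
    cov = np.cov(Xc, rowvar=False)              # feature covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigen decomposition (ascending order)
    top = eigvecs[:, ::-1][:, :k]               # loadings of the k leading components
    selected = []
    for j in range(k):
        order = np.argsort(-np.abs(top[:, j]))  # features by absolute loading on component j
        for idx in order:                       # keep the highest-loading feature not yet chosen
            if int(idx) not in selected:
                selected.append(int(idx))
                break
    return selected


if __name__ == "__main__":
    # Purely illustrative usage with random data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    print(select_features_by_pca_loadings(X, 3))
```
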

References (21)
  • 4
    • L. Yu and H. Liu, "Feature Selection for High-Dimensional Data: A Fast Correlation-Based Filter Solution," in Proc. 20th International Conference on Machine Learning, 2003, pp. 856-863.
  • 6
    • H. Peng, F. Long, and C. Ding, "Feature Selection Based on Mutual Information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy," in IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, pp. 1226-1238.
  • 11
    • G. P. McCabe, "Principal Variables," in Technometrics, 1984, pp. 137-144.
  • 12
    • J. A. Cumming and D. A. Wooff, "Dimension Reduction via Principal Variables," in Computational Statistics & Data Analysis, vol. 52, no. 1, 2007, pp. 550-565. DOI 10.1016/j.csda.2007.02.012
  • 13
    • M. A. Hall, "Correlation-Based Feature Selection for Discrete and Numeric Class Machine Learning," in Proc. 17th International Conference on Machine Learning, 2000, pp. 359-366.
  • 14
    • R. Kohavi and G. H. John, "Wrappers for Feature Subset Selection," in Artificial Intelligence, vol. 97, no. 1-2, 1997, pp. 273-324.
  • 16
    • R. Leardi, R. Boggia, and M. Terrile, "Genetic Algorithms as a Strategy for Feature Selection," in J. Chemometrics, 1992, pp. 267-281.
  • 19
    • W. J. Krzanowski, "Selection of Variables to Preserve Multivariate Data Structure Using Principal Components," in Applied Statistics, 1987, pp. 22-33.
  • 20
    • J. Cadima and I. T. Jolliffe, "Loading and Correlations in the Interpretation of Principal Components," in J. Appl. Statist., 1995, pp. 203-214.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.