Knowledge-Based Systems, Volume 64, 2014, Pages 70-80

Feature selection using data envelopment analysis

Author keywords

Data envelopment analysis; Feature selection; Redundancy; Relevance; Super efficiency

Indexed keywords

DATA ENVELOPMENT ANALYSIS; FEATURE EXTRACTION; REDUNDANCY;

EID: 84901192462     PISSN: 09507051     EISSN: None     Source Type: Journal    
DOI: 10.1016/j.knosys.2014.03.022     Document Type: Article
Times cited: 45

References (55)
  • 1
    • L. Yu, H. Liu, Efficient feature selection via analysis of relevance and redundancy, J. Mach. Learn. Res. 5 (2004) 1205-1224.
  • 2
    • S. Cang, H. Yu, Mutual information based input feature selection for classification problems, Decis. Support Syst. 54 (2012) 691-698.
  • 3
    • I. Guyon, A. Elisseeff, An introduction to variable and feature selection, J. Mach. Learn. Res. 3 (2003) 1157-1182.
  • 4
    • H. Liu, L. Yu, Toward integrating feature selection algorithms for classification and clustering, IEEE Trans. Knowl. Data Eng. 17 (4) (2005) 491-502.
  • 6
    • I. Guyon, J. Weston, S. Barnhill, V. Vapnik, Gene selection for cancer classification using support vector machines, Mach. Learn. 46 (2002) 389-422.
  • 7
    • T.S. Furey, N. Cristianini, N. Duffy, D.W. Bednarski, M. Schummer, D. Haussler, Support vector machine classification and validation of cancer tissue samples using microarray expression data, Bioinformatics 16 (10) (2000) 906-914.
  • 8
    • G. Qu, S. Hariri, M. Yousif, A new dependency and correlation analysis for features, IEEE Trans. Knowl. Data Eng. 17 (9) (2005) 1199-1207.
  • 9
    • H. Peng, F. Long, C. Ding, Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy, IEEE Trans. Pattern Anal. Mach. Intell. 27 (8) (2005) 1226-1238.
  • 10
    • D. Huang, T.W.S. Chow, Effective feature selection scheme using mutual information, Neurocomputing 63 (2005) 325-343.
  • 11
    • J.J. Huang, Y.Z. Cai, X.M. Xu, A parameterless feature ranking algorithm based on MI, Neurocomputing 71 (2008) 1656-1668.
  • 16
    • I. Kononenko, Estimating attributes: analysis and extensions of RELIEF, in: Proceedings of the European Conference on Machine Learning, ECML'94, Springer-Verlag New York, Inc., Secaucus, NJ, USA, 1994, pp. 171-182.
  • 17
    • R. Battiti, Using mutual information for selecting features in supervised neural net learning, IEEE Trans. Neural Netw. 5 (4) (1994) 537-550.
  • 18
    • M.A. Hall, Correlation-based feature selection for discrete and numeric class machine learning, in: Proceedings of the 17th International Conference on Machine Learning, ICML'00, Morgan Kaufmann, Los Altos, CA, USA, 2000, pp. 359-366.
  • 19
    • F. Fleuret, Fast binary feature selection with conditional mutual information, J. Mach. Learn. Res. 5 (2004) 1531-1555.
  • 20
    • I. Tsamardinos, C. Aliferis, A. Statnikov, Local causal and Markov blanket induction for causal discovery and feature selection for classification part I: algorithms and empirical evaluation, J. Mach. Learn. Res. 11 (2010) 171-234.
  • 22
    • E. Amaldi, V. Kann, On the approximability of minimizing nonzero variables or unsatisfied relations in linear systems, Theor. Comput. Sci. 209 (1-2) (1998) 237-260.
  • 23
    • A.A. Albrecht, Stochastic local search for the feature set problem, with applications to microarray data, Appl. Math. Comput. 183 (2) (2006) 1148-1164.
  • 24
    • N. Kwak, C.H. Choi, Input feature selection for classification problems, IEEE Trans. Neural Netw. 13 (1) (2002) 143-159.
  • 28
    • R. Kohavi, G.H. John, Wrappers for feature subset selection, Artif. Intell. 97 (1-2) (1997) 273-324.
  • 29
    • C. Ding, H. Peng, Minimum redundancy feature selection from microarray gene expression data, in: Proceedings of the IEEE Computer Society Conference on Bioinformatics, CSB'03, IEEE Computer Society, Washington, DC, USA, 2003, pp. 523-528.
  • 31
    • Y. Zhang, S. Li, T. Wang, Z. Zhang, Divergence-based feature selection for separate classes, Neurocomputing 101 (2013) 32-42.
  • 33
    • Y. Zhang, Z. Zhang, Feature subset selection with cumulate conditional mutual information minimization, Expert Syst. Appl. 39 (5) (2012) 6078-6088.
  • 34
    • Q. Song, J. Ni, G. Wang, A fast clustering-based feature subset selection algorithm for high-dimensional data, IEEE Trans. Knowl. Data Eng. 25 (1) (2013) 1-14.
  • 35
    • J.M. Sotoca, F. Pla, Supervised feature selection by clustering using conditional mutual information-based distances, Pattern Recogn. 43 (6) (2010) 2068-2081.
  • 36
    • G. Brown, A. Pocock, M.-J. Zhao, M. Luján, Conditional likelihood maximisation: a unifying framework for information theoretic feature selection, J. Mach. Learn. Res. 13 (1) (2012) 27-66.
  • 37
    • N.X. Vinh, J. Bailey, Comments on supervised feature selection by clustering using conditional mutual information-based distances, Pattern Recogn. 46 (4) (2013) 1220-1225.
  • 39
    • I. Tsamardinos, L.E. Brown, C.F. Aliferis, The max-min hill-climbing Bayesian network structure learning algorithm, Mach. Learn. 65 (1) (2006) 31-78.
  • 40
  • 41
    • A. Charnes, W.W. Cooper, E. Rhodes, Measuring the efficiency of decision making units, Eur. J. Oper. Res. 2 (1978) 429-444.
  • 42
    • P. Andersen, N. Petersen, A procedure for ranking efficient units in data envelopment analysis, Manage. Sci. 39 (1993) 1261-1264.
  • 43
    • W.D. Cook, L.M. Seiford, Data envelopment analysis (DEA) - thirty years on, Eur. J. Oper. Res. 192 (1) (2009) 1-17.
  • 44
    • N. Karmarkar, A new polynomial time algorithm for linear programming, Combinatorica 4 (4) (1984) 373-395.
  • 45
    • M. Robnik-Sikonja, I. Kononenko, Theoretical and empirical analysis of ReliefF and RReliefF, Mach. Learn. 53 (2003) 23-69.
  • 48
    • D. Aha, D. Kibler, Instance-based learning algorithms, Mach. Learn. 6 (1991) 37-66.
  • 50
    • A. Azadeh, M. Saberi, R.T. Moghaddam, L. Javanmardi, An integrated data envelopment analysis-artificial neural network-rough set algorithm for assessment of personnel efficiency, Expert Syst. Appl. 38 (3) (2011) 1364-1373.
  • 53
    • J.T. Pastor, J.L. Ruiz, I. Sirvent, An enhanced DEA Russell graph efficiency measure, Eur. J. Oper. Res. 115 (3) (1999) 596-607.
  • 54
    • W.W. Cooper, K.S. Park, J.T. Pastor, RAM: a range adjusted measure of inefficiency for use with additive models, and relations to other models and measures in DEA, J. Prod. Anal. 11 (1999) 5-42.
  • 55
    • W.W. Cooper, J.T. Pastor, F. Borras, J. Aparicio, D. Pastor, BAM: a bounded adjusted measure of efficiency for use with bounded additive models, J. Prod. Anal. 35 (2) (2011) 85-94.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.