1. Lin Y., Hu X., Wu X. Quality of information-based source assessment and selection. Neurocomputing 2014, 133:95-102.
2. Cang S., Yu H. Mutual information based input feature selection for classification problems. Decis. Support Syst. 2012, 54:691-698.
3. Guyon I., Elisseeff A. An introduction to variable and feature selection. J. Mach. Learn. Res. 2003, 3:1157-1182.
4. Liu H., Yu L. Toward integrating feature selection algorithms for classification and clustering. IEEE Trans. Knowl. Data Eng. 2005, 17(4):491-502.
5. Furey T.S., Cristianini N., Duffy N., Bednarski D.W., Schummer M., Haussler D. Support vector machine classification and validation of cancer tissue samples using microarray expression data. Bioinformatics 2000, 16(10):906-914.
6. Qu G., Hariri S., Yousif M. A new dependency and correlation analysis for features. IEEE Trans. Knowl. Data Eng. 2005, 17(9):1199-1207.
7. Peng H., Long F., Ding C. Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27(8):1226-1238.
8. Huang D., Chow T.W.S. Effective feature selection scheme using mutual information. Neurocomputing 2005, 63:325-343.
9. Huang J.J., Cai Y.Z., Xu X.M. A parameterless feature ranking algorithm based on MI. Neurocomputing 2008, 71:1656-1668.
10. Yu L., Liu H. Efficient feature selection via analysis of relevance and redundancy. J. Mach. Learn. Res. 2004, 5:1205-1224.
11. Song L., Smola A., Gretton A., Bedo J., Borgwardt K. Feature selection via dependence maximization. J. Mach. Learn. Res. 2012, 13:1393-1434.
12. Frénay B., Doquire G., Verleysen M. Theoretical and empirical study on the potential inadequacy of mutual information for feature selection in classification. Neurocomputing 2013, 112:64-78.
13. Zhao Z., Liu H. Spectral feature selection for supervised and unsupervised learning. In: Proceedings of the 24th International Conference on Machine Learning, ICML'07, ACM Press, New York, NY, USA, 2007, pp. 1151-1157.
14. Song L., Smola A.J., Borgwardt K.M., Bedo J. Supervised feature selection via dependence estimation. In: Proceedings of the 24th International Conference on Machine Learning, ICML'07, ACM Press, New York, NY, USA, 2007, pp. 823-830.
15. Hall M.A. Correlation-based feature selection for discrete and numeric class machine learning. In: Proceedings of the Seventeenth International Conference on Machine Learning, ICML'00, Morgan Kaufmann, Los Altos, CA, USA, 2000, pp. 359-366.
16. Amaldi E., Kann V. On the approximability of minimizing nonzero variables or unsatisfied relations in linear systems. Theor. Comput. Sci. 1998, 209:237-260.
17. Albrecht A.A. Stochastic local search for the feature set problem, with applications to microarray data. Appl. Math. Comput. 2006, 183(2):1148-1164.
18. Battiti R. Using mutual information for selecting features in supervised neural net learning. IEEE Trans. Neural Netw. 1994, 5(4):537-550.
19. Kwak N., Choi C.H. Input feature selection for classification problems. IEEE Trans. Neural Netw. 2002, 13(1):143-159.
20. Fleuret F. Fast binary feature selection with conditional mutual information. J. Mach. Learn. Res. 2004, 5:1531-1555.
21. Kira K., Rendell L. A practical approach to feature selection. In: Proceedings of the Ninth International Workshop on Machine Learning, ML'92, Morgan Kaufmann, San Francisco, CA, USA, 1992, pp. 249-256.
22. Kononenko I. Estimating attributes: analysis and extensions of RELIEF. In: Proceedings of the European Conference on Machine Learning, ECML'94, Springer-Verlag New York, Inc., Secaucus, NJ, USA, 1994, pp. 171-182.
23. Lewis D.D. Feature selection and feature extraction for text categorization. In: Proceedings of the Workshop on Speech and Natural Language, Association for Computational Linguistics, Morristown, NJ, USA, 1992, pp. 212-217.
24. Zhang Y., Yang A., Xiong C., Zhang Z. Feature selection using data envelopment analysis. Knowl.-Based Syst. 2014, 64:70-80.
25. Webb A.R. Statistical Pattern Recognition, 2nd edition. 2002, John Wiley & Sons Ltd, Chichester, West Sussex, England.
26. Koller D., Sahami M. Toward optimal feature selection. In: Proceedings of the International Conference on Machine Learning, ICML'96, Morgan Kaufmann, Los Altos, CA, USA, 1996, pp. 284-292.
27. Kohavi R., John G.H. Wrappers for feature subset selection. Artif. Intell. 1997, 97:273-324.
28. Ding C., Peng H. Minimum redundancy feature selection from microarray gene expression data. In: Proceedings of the IEEE Computer Society Conference on Bioinformatics, CSB'03, IEEE Computer Society, Washington, DC, USA, 2003, pp. 523-528.
29. Zhang Y., Li S., Wang T., Zhang Z. Divergence-based feature selection for separate classes. Neurocomputing 2013, 101:32-42.
30. Cover T.M. The best two independent measurements are not the two best. IEEE Trans. Syst. Man Cybern. 1974, 4:116-117.
32. Song Q., Ni J., Wang G. A fast clustering-based feature subset selection algorithm for high-dimensional data. IEEE Trans. Knowl. Data Eng. 2013, 25(1):1-14.
33. Wang G., Lochovsky F.H., Yang Q. Feature selection with conditional mutual information maximin in text categorization. In: Proceedings of the Thirteenth ACM International Conference on Information and Knowledge Management, CIKM'04, ACM Press, New York, NY, USA, 2004, pp. 342-349.
34. Sotoca J.M., Pla F. Supervised feature selection by clustering using conditional mutual information-based distances. Pattern Recognit. 2010, 43(6):2068-2081.
35. Zhang Y., Zhang Z. Feature subset selection with cumulate conditional mutual information minimization. Expert Syst. Appl. 2012, 39:6078-6088.
37. Meyer P., Bontempi G. On the use of variable complementarity for feature selection in cancer classification. Evol. Comput. Mach. Learn. Bioinform. 2006, 3907:91-102.
38. Brown G., Pocock A., Zhao M.-J., Luján M. Conditional likelihood maximisation: a unifying framework for information theoretic feature selection. J. Mach. Learn. Res. 2012, 13:27-66.
39. Cook W.D., Seiford L.M. Data envelopment analysis (DEA) - thirty years on. Eur. J. Oper. Res. 2009, 192:1-17.
40. Zheng Z., Padmanabhan B. Constructing ensembles from data envelopment analysis. INFORMS J. Comput. 2007, 19(4):486-496.
41. Yan H., Wei Q. Data envelopment analysis classification machine. Inf. Sci. 2011, 181:5029-5041.
42. Pendharkar P.C., Troutt M.D. Interactive classification using data envelopment analysis. Ann. Oper. Res. 2014, 214:125-141.
43. Pendharkar P. Fuzzy classification using the data envelopment analysis. Knowl.-Based Syst. 2012, 31:183-192.
45. Tsamardinos I., Aliferis C.F., Statnikov A. Algorithms for large scale Markov blanket discovery. In: Proceedings of the 16th International Florida Artificial Intelligence Research Society Conference, FLAIRS'03, AAAI Press, Menlo Park, CA, USA, 2003, pp. 376-381.
46. Tsamardinos I., Aliferis C., Statnikov A. Local causal and Markov blanket induction for causal discovery and feature selection for classification. Part I: Algorithms and empirical evaluation. J. Mach. Learn. Res. 2010, 11:171-234.
47. Karmarkar N. A new polynomial time algorithm for linear programming. Combinatorica 1984, 4:373-395.
51. Aha D., Kibler D. Instance-based learning algorithms. Mach. Learn. 1991, 6:37-66.
52. Witten I.H., Frank E. Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations. 2000, Morgan Kaufmann, San Francisco, CA, USA.
53. Robnik-Sikonja M., Kononenko I. Theoretical and empirical analysis of ReliefF and RReliefF. Mach. Learn. 2003, 53:23-69.
54. Cooper W.W., Seiford L.M., Tone K. Data Envelopment Analysis: A Comprehensive Text with Models, Applications, References, and DEA-Solver Software. 2007, Springer, New York, USA.
55. Pastor J.T., Ruiz J.L., Sirvent I. An enhanced DEA Russell graph efficiency measure. Eur. J. Oper. Res. 1999, 115(3):596-607.
56. Cooper W.W., Park K.S., Pastor J.T. RAM: a range adjusted measure of inefficiency for use with additive models, and relations to other models and measures in DEA. J. Product. Anal. 1999, 11:5-42.
57. Cooper W.W., Pastor J.T., Borras F., Aparicio J., Pastor D. BAM: a bounded adjusted measure of efficiency for use with bounded additive models. J. Product. Anal. 2011, 35(2):85-94.