1. Banerjee, M., Pal, N.R., Feature selection with SVD entropy: some modification and extension. Inf. Sci. 264 (2014), 118–134.
2. Battiti, R., Using mutual information for selecting features in supervised neural net learning. IEEE Trans. Neural Netw. 5:4 (1994), 537–550.
3. Bennasar, M., Hicks, Y., Setchi, R., Feature selection using joint mutual information maximisation. Expert Syst. Appl. 42 (2015), 8520–8532.
4. Brown, G., Pocock, A., et al. Conditional likelihood maximisation: a unifying framework for information theoretic feature selection. J. Mach. Learn. Res. 13 (2012), 27–66.
5. Brusco, M.J., A comparison of simulated annealing algorithms for variable selection in principal component analysis and discriminant analysis. Comput. Stat. Data Anal. 77 (2014), 38–53.
6. Chakraborty, D., Pal, N.R., Selecting useful groups of features in a connectionist framework. IEEE Trans. Neural Netw. 19:3 (2008), 381–396.
7. Chakraborty, R., Pal, N.R., Feature selection using a neural framework with controlled redundancy. IEEE Trans. Neural Netw. Learn. Syst. 26:1 (2015), 35–50.
8. Chang, J., Lee, H.K.H., Variable selection via a multi-stage strategy. J. Appl. Stat. 42:4 (2015), 762–774.
9. Che, J.X., Yang, Y.L., Stochastic correlation coefficient ensembles for variable selection. J. Appl. Stat., 2016, doi:10.1080/02664763.2016.1221913.
10. Chernbumroong, S., Cang, S., Yu, H., Maximum relevancy maximum complementary feature selection for multi-sensor activity recognition. Expert Syst. Appl. 42 (2015), 573–583.
11. Chipman, H.A., George, E.I., McCulloch, R.E., BART: Bayesian additive regression trees. Ann. Appl. Stat. 4 (2010), 266–298.
12. Chipman, H., McCulloch, R., BayesTree: Bayesian methods for tree based models, R package version 0.3-1.1, 2010. Available at http://www.cran.r-project.org/web/packages/BayesTree/index.html.
13. Cover, T.M., Thomas, J.A., Elements of Information Theory. 1991, John Wiley and Sons, Inc., New York, NY, USA.
15. Ding, F., Hierarchical estimation algorithms for multivariable systems using measurement information. Inf. Sci. 277:2 (2014), 396–405.
16. Ding, F., Wang, F., Xu, L., Hayat, T., Alsaedi, A., Parameter estimation for pseudo-linear systems using the auxiliary model and the decomposition technique. IET Control Theory Appl. 11 (2017), 390–400.
17. Ding, F., Wang, F., Xu, L., Wu, M., Decomposition based least squares iterative identification algorithm for multivariate pseudo-linear ARMA systems using the data filtering. J. Franklin Inst. 354 (2017), 1321–1339.
18. Ding, F., Xu, L., Zhu, Q., Performance analysis of the generalised projection identification for time-varying systems. IET Control Theory Appl. 10 (2016), 2506–2514.
19. Emary, E., Zawbaa, H.M., Hassanien, A.E., Binary grey wolf optimization approaches for feature selection. Neurocomputing 172 (2016), 371–381.
20. Estévez, P.A., Normalized mutual information feature selection. IEEE Trans. Neural Netw. 20:2 (2009), 189–201.
21. Fleuret, F., Fast binary feature selection with conditional mutual information. J. Mach. Learn. Res. 5 (2004), 1531–1555.
22. Freeman, C., Kulić, D., Basir, O., An evaluation of classifier-specific filter measure performance for feature selection. Pattern Recognit. 48 (2015), 1812–1826.
23. García-Torres, M., Gómez-Vela, F., Melián-Batista, B., Moreno-Vega, J.M., High-dimensional feature selection via feature grouping: a variable neighborhood search approach. Inf. Sci. 326 (2016), 102–118.
24. Gramacy, R.B., tgp: an R package for Bayesian nonstationary, semiparametric nonlinear regression and design by treed Gaussian process models. J. Stat. Softw. 19 (2007), 1–46.
25. Hausser, J., Strimmer, K., Entropy inference and the James-Stein estimator, with application to nonlinear gene association networks. J. Mach. Learn. Res. 10 (2009), 1469–1484. See website for package: http://www.strimmerlab.org/software/entropy/.
26. Huang, D., Chow, T.W.S., Effective feature selection scheme using mutual information. Neurocomputing 63 (2005), 325–343.
27. John, G.H., Kohavi, R., Pfleger, K., Irrelevant features and the subset selection problem. Proceedings of the Eleventh International Conference on Machine Learning, 1994, 121–129.
28. Kang, R., Zhang, T., Tang, H., Zhao, W., Adaptive region boosting method with biased entropy for path planning in changing environment. CAAI Trans. Intell. Technol. 1 (2016), 179–188.
29. Kwak, N., Kernel discriminant analysis for regression problems. Pattern Recognit. 45 (2012), 2019–2031.
30. Kwak, N., Choi, C.H., Input feature selection for classification problems. IEEE Trans. Neural Netw. 13:1 (2002), 143–159.
32. Li, W., Jia, Y., Du, J., Distributed filtering for discrete-time linear systems with fading measurements and time-correlated noise. Digital Signal Process. 60 (2017), 211–219.
33. MacKay, D.J.C., Information Theory, Inference, and Learning Algorithms. 2003, Cambridge University Press, Cambridge.
34. Maldonado, S., Carrizosa, E., Weber, R., Kernel penalized k-means: a feature selection method based on kernel k-means. Inf. Sci. 322 (2015), 150–160.
35. Meyer, P.E., Bontempi, G., On the use of variable complementarity for feature selection in cancer classification. Proceedings of the European Workshop on Applications of Evolutionary Computing (EvoWorkshops), 2006, 91–102.
36. Meyer, P.E., Schretter, C., Bontempi, G., Information-theoretic feature selection in microarray data using variable complementarity. IEEE J. Sel. Top. Signal Process. 2 (2008), 261–274.
37. Mkhadri, A., Ouhourane, M., A group VISA algorithm for variable selection. Stat. Methods Appl. 24 (2015), 41–60.
38. Peng, H., Long, F., Ding, C., Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 27:8 (2005), 1226–1238.
39. Ryuhei, M., Yuichi, T., Mixed integer second-order cone programming formulations for variable selection in linear regression. Eur. J. Oper. Res. 247:3 (2015), 721–731.
41. Unler, A., Murat, A., Chinnam, R.B., mr2PSO: a maximum relevance minimum redundancy feature selection method based on swarm intelligence for support vector machine classification. Inf. Sci. 181:20 (2011), 4625–4641.
43. Vinh, L.T., Thang, N.D., Lee, Y.K., An improved maximum relevance and minimum redundancy feature selection algorithm based on normalized mutual information. Tenth International Symposium on Applications and the Internet, 2010, 395–398.
44. Wang, H., Bell, D., Murtagh, F., Axiomatic approach to feature subset selection based on relevance. IEEE Trans. Pattern Anal. Mach. Intell. 21:3 (1999), 271–277.
45. Wang, D., Ding, F., Chu, Y., Data filtering based recursive least squares algorithm for Hammerstein systems using the key-term separation principle. Inf. Sci. 222:3 (2013), 203–212.
46. Wang, Z., Li, M., Li, J., A multi-objective evolutionary algorithm for feature selection based on mutual information with a new redundancy measure. Inf. Sci. 307 (2015), 73–88.
47. Wang, D., Ding, F., Parameter estimation algorithms for multivariable Hammerstein CARMA systems. Inf. Sci. 355-356 (2016), 237–248.
48. Xu, L., The damping iterative parameter identification method for dynamical systems based on the sine signal measurement. Signal Process. 120 (2016), 660–667.
49. Xu, L., Application of the Newton iteration algorithm to the parameter estimation for dynamical systems. J. Comput. Appl. Math. 288 (2015), 33–43.
50. Xu, R.F., Lee, S.J., Dimensionality reduction by feature clustering for regression problems. Inf. Sci. 299 (2015), 42–57.
52. Yin, S., Zhu, X., Kaynak, O., Improved PLS focused on key performance indicator related fault diagnosis. IEEE Trans. Ind. Electron. 62:3 (2015), 1651–1658.
53. Yin, S., Wang, G., Gao, H., Data-driven process monitoring based on modified orthogonal projections to latent structures. IEEE Trans. Control Syst. Technol. 24:4 (2016), 1480–1487.
54. Yu, L., Liu, H., Efficient feature selection via analysis of relevance and redundancy. J. Mach. Learn. Res. 5 (2004), 1205–1224.