2. Battiti, R. (1994). Using mutual information for selecting features in supervised neural net learning. IEEE Transactions on Neural Networks, 5(4), 537-550.
4. Feraud, R., & Clerot, F. (2002). A methodology to explain neural network classification. Neural Networks, 15, 237-246.
5. Fisher, J. W., & Principe, J. C. (1998). A methodology for information theoretic feature extraction. In IEEE World Congress on Computational Intelligence (pp. 1712-1716).
6. Gurney, K. N. (2001). Information processing in dendrites: II. Information theoretic complexity. Neural Networks, 14, 1005-1022.
8. Howes, P., & Crook, N. (1999). Using input parameter influences to support the decisions of feedforward neural networks. Neurocomputing, 24.
9. Ishikawa, M. (1996). Structural learning with forgetting. Neural Networks, 9(3), 509-521.
10. Ishikawa, M. (2000). Rule extraction by successive regularization. Neural Networks, 13, 1171-1183.
11. Kamimura, R. (2003). Information-theoretic competitive learning with inverse Euclidean distance output units. Neural Processing Letters, 18, 163-184.
12. Kamimura, R. (2003). Teacher-directed learning: Information-theoretic competitive learning in supervised multi-layered networks. Connection Science, 15, 117-140.
13. Kamimura, R. (2006). Improving information-theoretic competitive learning by accentuated information maximization. International Journal of General Systems, 34(3), 219-233.
14. Kamimura, R. (2007). Information loss to extract distinctive features in competitive learning. In Proceedings of the IEEE Conference on Systems, Man, and Cybernetics (pp. 1217-1222).
15. Kamimura, R. (2008a). Conditional information and information loss for flexible feature extraction. In Proceedings of the International Joint Conference on Neural Networks (IJCNN 2008) (pp. 2047-2083).
16. Kamimura, R. (2008b). Feature detection and information loss in competitive learning. In Proceedings of SCIS and ISIS (pp. 1144-1148).
17. Kamimura, R. (2008c). Feature discovery by enhancement and relaxation of competitive units. In Intelligent Data Engineering and Automated Learning - IDEAL 2008 (LNCS, Vol. 5326, pp. 148-155). Springer.
18. Kamimura, R. (2008d). Free energy-based competitive learning for mutual information maximization. In Proceedings of the IEEE Conference on Systems, Man, and Cybernetics (pp. 223-227).
19. Kamimura, R. (2008e). Free energy-based competitive learning for self-organizing maps. In Proceedings of Artificial Intelligence and Applications (pp. 414-419).
20. Kamimura, R. (2009). Enhancing and relaxing competitive units for feature discovery. Neural Processing Letters, 30(1), 37-57.
22. Kamimura, R., Kamimura, T., & Uchida, O. (2001). Flexible feature discovery and structural information control. Connection Science, 13(4), 323-347.
23. Kaski, S., Nikkila, J., & Kohonen, T. (1998). Methods for interpreting a self-organized map in data analysis. In Proceedings of the European Symposium on Artificial Neural Networks. Bruges, Belgium.
24. Kilmer, W. (1996). Global inhibition for selecting modes of attention. Neural Networks, 9(4), 567-573.
25. Kohavi, R., & John, G. (1997). Wrappers for feature subset selection. Artificial Intelligence, 97(1-2), 273-324.
28. Lanyon, L. J., & Denham, S. L. (2004). A biased competition computational model of spatial and object-based attention mediating active visual search. Neurocomputing, 58-60, 655-662.
29. Lanyon, L. J., & Denham, S. L. (2004). A model of active visual search with object-based attention. Neural Networks, 17, 873-897.
30. Mao, J., & Jain, A. K. (1995). Artificial neural networks for feature extraction and multivariate data projection. IEEE Transactions on Neural Networks, 6(2), 296-317.
31. Micheli, A., Sperduti, A., & Starita, A. (2001). Analysis of the internal representations developed by neural networks for structures applied to quantitative structure-activity relationship studies of benzodiazepines. Journal of Chemical Information and Computer Sciences, 41, 202-218.
32. Newman, J., Baars, B. J., & Cho, S. B. (1997). A neural global workspace model for conscious attention. Neural Networks, 10(7), 1195-1206.
33. Nord, L. I., & Jacobsson, S. P. (1998). A novel method for examination of the variable contribution to computational neural network models. Chemometrics and Intelligent Laboratory Systems, 44, 153-160.
34. Perkins, S., Lacker, K., & Theiler, J. (2003). Grafting: Fast, incremental feature selection by gradient descent in function space. Journal of Machine Learning Research, 3, 1333-1356.
37. Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning internal representations by error propagation. In D. E. Rumelhart & J. L. McClelland (Eds.), Parallel distributed processing (Vol. 1, pp. 318-362). Cambridge, MA: MIT Press.
38. Rumelhart, D. E., & Zipser, D. (1985). Feature discovery by competitive learning. Cognitive Science, 9, 75-112.
39. Setiono, R., Leow, W. K., & Zurada, J. M. (2002). Extraction of rules from artificial neural networks for nonlinear regression. IEEE Transactions on Neural Networks, 13(3), 564-577.
40. Sindhwani, V., Rakshit, S., Deodhare, D., Erdogmus, D., & Principe, J. C. (2004). Feature selection in MLPs and SVMs based on maximum output information. IEEE Transactions on Neural Networks, 15(4), 937-948.
41. Taylor, J. G., & Fragopanagos, N. F. (2005). The interaction of attention and emotion. Neural Networks, 18, 353-369.
42. Torkkola, K. (2001). Nonlinear feature transform using maximum mutual information. In Proceedings of the International Joint Conference on Neural Networks (pp. 2756-2761).
43. Torkkola, K. (2003). Feature extraction by non-parametric mutual information maximization. Journal of Machine Learning Research, 3, 1415-1438.
44. Towell, G. G., & Shavlik, J. W. (1993). Extracting refined rules from knowledge-based neural networks. Machine Learning, 13, 71-101.
45. Ultsch, A. (2003). U*-matrix: A tool to visualize clusters in high dimensional data. Technical Report 36, Department of Computer Science, University of Marburg.
46. Ultsch, A., & Siemon, H. P. (1990). Kohonen self-organization feature maps for exploratory data analysis. In Proceedings of the International Neural Network Conference (pp. 305-308). Dordrecht: Kluwer Academic Publishers.
47. Vesanto, J. (1999). SOM-based data visualization methods. Intelligent Data Analysis, 3, 111-126.
48. Vesanto, J., Himberg, J., Alhoniemi, E., & Parhankangas, J. (2000). SOM toolbox for Matlab 5. Technical Report A57, Helsinki University of Technology.