1. Breiman, L. 1996. Stacked Regressions, Machine Learning, 24:49-64, Kluwer.
3. Clarke, B. 2003. Comparing Bayes Model Averaging and Stacking when Model Approximation Error Cannot be Ignored, JMLR, 4:683-712.
5. Dzeroski, S. and Ženko, B. 2004. Is Combining Classifiers with Stacking Better than Selecting the Best One?, Machine Learning, 54:255-273, Kluwer.
6. Hastie, T., Tibshirani, R., Friedman, J. 2001. The Elements of Statistical Learning: Data Mining, Inference and Prediction, Springer Series in Statistics.
7. Ho, T.K. 1998. The Random Subspace Method for Constructing Decision Forests, IEEE PAMI, 20(8):832-844.
9. Krogh, A. and Vedelsby, J. 1995. Neural Network Ensembles, Cross Validation, and Active Learning, Advances in Neural Information Processing Systems, MIT Press, pp. 231-238.
10. Kuncheva, L.I. 2002. Switching between Selection and Fusion in Combining Classifiers: An Experiment, IEEE Transactions on SMC, Part B, 32(2):146-156.
11. LeBlanc, M. and Tibshirani, R. 1992. Combining Estimates in Regression and Classification, Technical Report, Dept. of Statistics, University of Toronto.
12. Puuronen, S., Terziyan, V., Tsymbal, A. 1999. A Dynamic Integration Algorithm for an Ensemble of Classifiers, Foundations of Intelligent Systems, 11th International Symposium ISMIS'99, LNAI, Vol. 1609, pp. 592-600, Springer-Verlag.
14. Seewald, A. 2002. How to Make Stacking Better and Faster While Also Taking Care of an Unknown Weakness, In Proceedings of the Nineteenth International Conference on Machine Learning (ICML-2002), Sydney, Australia, Morgan Kaufmann Publishers, San Francisco.
15. Sharkey, A. 1996. On Combining Artificial Neural Nets, Connection Science, 8:299-314.
17. www.liacc.up.pt/~ltoreo/Regression/DataSets.html
18. Tsymbal, A., Puuronen, S., Patterson, D. 2003. Ensemble Feature Selection with the Simple Bayesian Classification, Information Fusion, 4:87-100, Elsevier.
20. Wolpert, D. 1992. Stacked Generalization, Neural Networks, 5:241-259.
21. Wolpert, D. and Macready, W. 1996. Combining Stacking with Bagging to Improve a Learning Algorithm, Technical Report, Santa Fe Institute, Santa Fe.
22. Zenobi, G. and Cunningham, P. 2001. Using Diversity in Preparing Ensembles of Classifiers Based on Different Feature Subsets to Minimize Generalization Error, In Proceedings of the 12th ECML, LNCS, Vol. 2167, pp. 576-587, Springer.
23. Zhou, Z., Wu, J., Tang, W. 2002. Ensembling Neural Networks: Many Could Be Better than All, Artificial Intelligence, 137(1-2):239-263, Elsevier.