Volume 3512, 2005, Pages 598-603

Cascade ensembles

Author keywords

[No Author keywords available]

Indexed keywords

ARTIFICIAL INTELLIGENCE; CLASSIFICATION (OF INFORMATION); COMPUTER SCIENCE; PROBLEM SOLVING; REGRESSION ANALYSIS;

EID: 25144453771     PISSN: 0302-9743     EISSN: None     Source Type: Conference Proceeding
DOI: 10.1007/11494669_73     Document Type: Conference Paper
Times cited: 10

References (16)
  • 1. R. Avnimelech and N. Intrator. Boosted mixture of experts: An ensemble learning scheme. Neural Computation, 11(2):483-497, 1999.
  • 3. S. Dzeroski and B. Zenko. Is combining classifiers with stacking better than selecting the best one? Machine Learning, 54:255-273, 2004.
  • 4. S. E. Fahlman and C. Lebiere. The cascade-correlation architecture. In D. S. Touretzky, editor, Advances in Neural Information Processing Systems 2, pages 524-532, San Mateo, CA, 1990. Morgan Kaufmann.
  • 5. A. Fern and R. Givan. Online ensemble learning: An empirical study. Machine Learning, 53:71-109, 2003.
  • 6. G. Giacinto and F. Roli. Dynamic classifier selection. In Multiple Classifier Systems 2000, volume 1857 of Lecture Notes in Computer Science, pages 177-189, 2000.
  • 7. J. Hansen. Combining predictors: Comparison of five meta machine learning methods. Information Sciences, 119(1-2):91-105, 1999.
  • 8. Y. Liu, X. Yao, and T. Higuchi. Evolutionary ensembles with negative correlation learning. IEEE Transactions on Evolutionary Computation, 4(4):380-387, November 2000.
  • 9. Y. Liu, X. Yao, Q. Zhao, and T. Higuchi. Evolving a cooperative population of neural networks by minimizing mutual information. In Proc. of the 2001 IEEE Congress on Evolutionary Computation, pages 384-389, Seoul, Korea, May 2001.
  • 10. C. J. Merz. Using correspondence analysis to combine classifiers. Machine Learning, 36(1):33-58, July 1999.
  • 11. M. P. Perrone and L. N. Cooper. When networks disagree: Ensemble methods for hybrid neural networks. In R. J. Mammone, editor, Neural Networks for Speech and Image Processing, pages 126-142. Chapman and Hall, 1993.
  • 12. L. Prechelt. Proben1 - A set of neural network benchmark problems and benchmarking rules. Technical Report 21/94, Fakultät für Informatik, Universität Karlsruhe, Karlsruhe, Germany, September 1994.
  • 13. A. J. C. Sharkey. On combining artificial neural nets. Connection Science, 8:299-313, 1996.
  • 14. G. I. Webb. Multiboosting: A technique for combining boosting and wagging. Machine Learning, 40(2):159-196, August 2000.
  • 16. Z.-H. Zhou, J. Wu, and W. Tang. Ensembling neural networks: Many could be better than all. Artificial Intelligence, 137(1-2):239-253, May 2002.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.