Volume 2600, 2003, Pages 118-183

An introduction to boosting and leveraging

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHMS; ARTIFICIAL INTELLIGENCE;

EID: 35248862907     PISSN: 0302-9743     EISSN: 1611-3349     Source Type: Book Series
DOI: 10.1007/3-540-36434-x_4     Document Type: Article
Times cited: 271

References (201)
  • 2. H. Akaike. A new look at the statistical model identification. IEEE Trans. Automat. Control, 19(6):716-723, 1974.
  • 5. A. Antos, B. Kégl, T. Linder, and G. Lugosi. Data-dependent margin-based generalization bounds for classification. JMLR, 3:73-98, 2002.
  • 6. J.A. Aslam. Improving algorithms for boosting. In Proc. COLT, San Francisco, 2000. Morgan Kaufmann.
  • 10. P.L. Bartlett, O. Bousquet, and S. Mendelson. Localized Rademacher averages. In Proceedings COLT'02, volume 2375 of LNAI, pages 44-58, Sydney, 2002. Springer.
  • 11. P.L. Bartlett and S. Mendelson. Rademacher and Gaussian complexities: Risk bounds and structural results. Journal of Machine Learning Research, 2002. To appear 10/02.
  • 12. E. Bauer and R. Kohavi. An empirical comparison of voting classification algorithms: Bagging, boosting and variants. Machine Learning, 36:105-142, 1999.
  • 13. H.H. Bauschke and J.M. Borwein. Legendre functions and the method of random Bregman projections. Journal of Convex Analysis, 4:27-67, 1997.
  • 16
  • 17. K.P. Bennett and O.L. Mangasarian. Robust linear programming discrimination of two linearly inseparable sets. Optimization Methods and Software, 1:23-34, 1992.
  • 18. A. Bertoni, P. Campadelli, and M. Parodi. A boosting algorithm for regression. In W. Gerstner, A. Germond, M. Hasler, and J.-D. Nicoud, editors, Proceedings ICANN'97, Int. Conf. on Artificial Neural Networks, volume V of LNCS, pages 343-348, Berlin, 1997. Springer.
  • 23. P.S. Bradley and O.L. Mangasarian. Feature selection via concave minimization and support vector machines. In Proc. 15th International Conf. on Machine Learning, pages 82-90. Morgan Kaufmann, San Francisco, CA, 1998.
  • 24. L.M. Bregman. The relaxation method for finding the common point of convex sets and its application to the solution of problems in convex programming. USSR Computational Math. and Math. Physics, 7:200-217, 1967.
  • 25. L. Breiman. Bagging predictors. Machine Learning, 26(2):123-140, 1996.
  • 26. L. Breiman. Bias, variance, and arcing classifiers. Technical Report 460, Statistics Department, University of California, July 1997.
  • 27. L. Breiman. Prediction games and arcing algorithms. Neural Computation, 11(7):1493-1518, 1999. Also Technical Report 504, Statistics Department, University of California, Berkeley.
  • 30. N. Bshouty and D. Gavinsky. On boosting with polynomially bounded distributions. JMLR, pages 107-111, 2002. Accepted.
  • 31. P. Bühlmann and B. Yu. Boosting with the L2 loss: Regression and classification. J. Amer. Statist. Assoc., 2002. Revised; also Technical Report 605, Stat. Dept., UC Berkeley, August 2001.
  • 32. C. Campbell and K.P. Bennett. A linear programming approach to novelty detection. In T.K. Leen, T.G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems, volume 13, pages 395-401. MIT Press, 2001.
  • 33. J. Carmichael. Non-intrusive appliance load monitoring system. EPRI Journal, Electric Power Research Institute, 1990.
  • 35. N. Cesa-Bianchi, A. Krogh, and M. Warmuth. Bounds on approximate steepest descent for likelihood maximization in exponential families. IEEE Transactions on Information Theory, 40(4):1215-1220, July 1994.
  • 36. O. Chapelle, V. Vapnik, O. Bousquet, and S. Mukherjee. Choosing multiple parameters for support vector machines. Machine Learning, 46(1):131-159, 2002.
  • 38. W.W. Cohen, R.E. Schapire, and Y. Singer. Learning to order things. In M.I. Jordan, M.J. Kearns, and S.A. Solla, editors, Advances in Neural Information Processing Systems, volume 10. The MIT Press, 1998.
  • 39. M. Collins, R.E. Schapire, and Y. Singer. Logistic Regression, AdaBoost and Bregman distances. Machine Learning, 48(1-3):253-285, 2002. Special Issue on New Methods for Model Selection and Model Combination.
  • 40. R. Cominetti and J.-P. Dussault. A stable exponential penalty algorithm with superlinear convergence. J.O.T.A., 83(2), November 1994.
  • 41
  • 43. D.D. Cox and F. O'Sullivan. Asymptotic analysis of penalized likelihood and related estimates. The Annals of Statistics, 18(4):1676-1695, 1990.
  • 44. K. Crammer and Y. Singer. On the learnability and design of output codes for multiclass problems. In N. Cesa-Bianchi and S. Goldberg, editors, Proc. COLT, pages 35-46, San Francisco, 2000. Morgan Kaufmann.
  • 51. T.G. Dietterich. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 40(2):139-157, 1999.
  • 53. C. Domingo and O. Watanabe. A modification of AdaBoost. In Proc. COLT, San Francisco, 2000. Morgan Kaufmann.
  • 56. N. Duffy and D.P. Helmbold. A geometric approach to leveraging weak learners. In P. Fischer and H.U. Simon, editors, Computational Learning Theory: 4th European Conference (EuroCOLT '99), pages 18-33, March 1999. Long version to appear in TCS.
  • 57. N. Duffy and D.P. Helmbold. Boosting methods for regression. Technical report, Department of Computer Science, University of Santa Cruz, 2000.
  • 58. N. Duffy and D.P. Helmbold. Leveraging for regression. In Proc. COLT, pages 208-219, San Francisco, 2000. Morgan Kaufmann.
  • 59. N. Duffy and D.P. Helmbold. Potential boosters? In S.A. Solla, T.K. Leen, and K.-R. Müller, editors, Advances in Neural Information Processing Systems, volume 12, pages 258-264. MIT Press, 2000.
  • 63. M. Frean and T. Downs. A simple cost function for boosting. Technical report, Dep. of Computer Science and Electrical Engineering, University of Queensland, 1998.
  • 64. Y. Freund. Boosting a weak learning algorithm by majority. Information and Computation, 121(2):256-285, September 1995.
  • 65. Y. Freund. An adaptive version of the boost by majority algorithm. Machine Learning, 43(3):293-318, 2001.
  • 66
  • 67. Y. Freund and R.E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. In EuroCOLT: European Conference on Computational Learning Theory. LNCS, 1994.
  • 69. Y. Freund and R.E. Schapire. Game theory, on-line prediction and boosting. In Proc. COLT, pages 325-332, New York, NY, 1996. ACM Press.
  • 70. Y. Freund and R.E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1):119-139, 1997.
  • 71. Y. Freund and R.E. Schapire. Adaptive game playing using multiplicative weights. Games and Economic Behavior, 29:79-103, 1999.
  • 72. Y. Freund and R.E. Schapire. A short introduction to boosting. Journal of Japanese Society for Artificial Intelligence, 14(5):771-780, September 1999. Appeared in Japanese; translation by Naoki Abe.
  • 74. J. Friedman, T. Hastie, and R.J. Tibshirani. Additive logistic regression: A statistical view of boosting. Annals of Statistics, 28(2):337-374, 2000. With discussion, pp. 375-407; also Technical Report, Department of Statistics, Sequoia Hall, Stanford University.
  • 75. J.H. Friedman. On bias, variance, 0/1-loss, and the curse of dimensionality. In Data Mining and Knowledge Discovery, volume 1, pages 55-77. Kluwer Academic Publishers, 1997.
  • 76. J.H. Friedman. Greedy function approximation. Technical report, Department of Statistics, Stanford University, February 1999.
  • 79. Y. Grandvalet. Bagging can stabilize without reducing variance. In ICANN'01, Lecture Notes in Computer Science. Springer, 2001.
  • 80. Y. Grandvalet, F. d'Alché-Buc, and C. Ambroise. Boosting mixture models for semi-supervised tasks. In Proc. ICANN, Vienna, Austria, 2001.
  • 83. W. Hart. Non-intrusive appliance load monitoring. Proceedings of the IEEE, 80(12), 1992.
  • 84. M. Haruno, S. Shirai, and Y. Ooyama. Using decision trees to construct a practical parser. Machine Learning, 34:131-149, 1999.
  • 87. D. Haussler. Decision theoretic generalizations of the PAC model for neural net and other learning applications. Information and Computation, 100:78-150, 1992.
  • 90. R. Herbrich. Learning Linear Classifiers: Theory and Algorithms, volume 7 of Adaptive Computation and Machine Learning. MIT Press, 2002.
  • 91. R. Herbrich, T. Graepel, and J. Shawe-Taylor. Sparsity vs. large margins for linear classifiers. In Proc. COLT, pages 304-308, San Francisco, 2000. Morgan Kaufmann.
  • 92. R. Herbrich and R. Williamson. Algorithmic luckiness. JMLR, 3:175-212, 2002.
  • 93. R. Hettich and K.O. Kortanek. Semi-infinite programming: Theory, methods and applications. SIAM Review, 35(3):380-429, September 1993.
  • 99. M.I. Jordan and R.A. Jacobs. Hierarchical mixtures of experts and the EM algorithm. Neural Computation, 6(2):181-214, 1994.
  • 100. M. Kearns and Y. Mansour. On the boosting ability of top-down decision tree learning algorithms. In Proc. 28th ACM Symposium on the Theory of Computing, pages 459-468. ACM Press, 1996.
  • 101. M. Kearns and L. Valiant. Cryptographic limitations on learning Boolean formulae and finite automata. Journal of the ACM, 41(1):67-95, January 1994.
  • 103. G.S. Kimeldorf and G. Wahba. Some results on Tchebycheffian spline functions. J. Math. Anal. Applic., 33:82-95, 1971.
  • 105. J. Kivinen, M. Warmuth, and P. Auer. The perceptron algorithm vs. winnow: Linear vs. logarithmic mistake bounds when few input variables are relevant. Special issue of Artificial Intelligence, 97(1-2):325-343, 1997.
  • 106. J. Kivinen and M.K. Warmuth. Additive versus exponentiated gradient updates for linear prediction. Information and Computation, 132(1):1-64, 1997.
  • 107. K.C. Kiwiel. Relaxation methods for strictly convex regularizations of piecewise linear programs. Applied Mathematics and Optimization, 38:239-259, 1998.
  • 108. V. Koltchinskii and D. Panchenko. Empirical margin distributions and bounding the generalization error of combined classifiers. Ann. Statist., 30(1), 2002.
  • 110. J. Lafferty. Additive models, boosting, and inference for generalized divergences. In Proc. 12th Annu. Conf. on Comput. Learning Theory, pages 125-133, New York, NY, 1999. ACM Press.
  • 111. G. Lebanon and J. Lafferty. Boosting and maximum likelihood for exponential models. In Advances in Neural Information Processing Systems, volume 14, 2002. To appear; longer version also NeuroCOLT Technical Report NC-TR-2001-098.
  • 113. M. Leshno, V. Lin, A. Pinkus, and S. Schocken. Multilayer feedforward networks with a nonpolynomial activation function can approximate any function. Neural Networks, 6:861-867, 1993.
  • 114. N. Littlestone, P.M. Long, and M.K. Warmuth. On-line learning of linear functions. Journal of Computational Complexity, 5:1-23, 1995. Earlier version is Technical Report CRL-91-29 at UC Santa Cruz.
  • 115. D.G. Luenberger. Linear and Nonlinear Programming. Addison-Wesley Publishing Co., Reading, second edition, May 1984. Reprinted with corrections in May 1989.
  • 116. G. Lugosi and N. Vayatis. A consistent strategy for boosting algorithms. In Proceedings of the Annual Conference on Computational Learning Theory, volume 2375 of LNAI, pages 303-318, Sydney, February 2002. Springer.
  • 117. Z.-Q. Luo and P. Tseng. On the convergence of coordinate descent method for convex differentiable minimization. Journal of Optimization Theory and Applications, 72(1):7-35, 1992.
  • 118. S. Mallat and Z. Zhang. Matching pursuits with time-frequency dictionaries. IEEE Transactions on Signal Processing, 41(12):3397-3415, December 1993.
  • 119. O.L. Mangasarian. Linear and nonlinear separation of patterns by linear programming. Operations Research, 13:444-452, 1965.
  • 120. O.L. Mangasarian. Arbitrary-norm separating plane. Operations Research Letters, 24(1):15-23, 1999.
  • 122. S. Mannor and R. Meir. On the existence of weak learners and applications to boosting. Machine Learning, 48(1-3):219-251, 2002.
  • 123. S. Mannor, R. Meir, and T. Zhang. The consistency of greedy algorithms for classification. In Proceedings COLT'02, volume 2375 of LNAI, pages 319-333, Sydney, 2002. Springer.
  • 124. L. Mason. Margins and Combined Classifiers. PhD thesis, Australian National University, September 1999.
  • 126. L. Mason, J. Baxter, P.L. Bartlett, and M. Frean. Functional gradient techniques for combining hypotheses. In A.J. Smola, P.L. Bartlett, B. Schölkopf, and D. Schuurmans, editors, Advances in Large Margin Classifiers. MIT Press, Cambridge, MA, 1999.
  • 127. L. Mason, J. Baxter, P.L. Bartlett, and M. Frean. Functional gradient techniques for combining hypotheses. In A.J. Smola, P.L. Bartlett, B. Schölkopf, and D. Schuurmans, editors, Advances in Large Margin Classifiers, pages 221-247. MIT Press, Cambridge, MA, 2000.
  • 129. R. Meir, R. El-Yaniv, and S. Ben-David. Localized boosting. In Proc. COLT, pages 190-199, San Francisco, 2000. Morgan Kaufmann.
  • 131. J. Mercer. Functions of positive and negative type and their connection with the theory of integral equations. Philos. Trans. Roy. Soc. London, A 209:415-446, 1909.
  • 132. S. Merler, C. Furlanello, B. Larcher, and A. Sboner. Tuning cost-sensitive boosting and its application to melanoma diagnosis. In J. Kittler and F. Roli, editors, Proceedings of the 2nd International Workshop on Multiple Classifier Systems MCS2001, volume 2096 of LNCS, pages 32-42. Springer, 2001.
  • 133. J. Moody. The effective number of parameters: An analysis of generalization and regularization in non-linear learning systems. In J. Moody, S.J. Hanson, and R.P. Lippman, editors, Advances in Neural Information Processing Systems, volume 4, pages 847-854, San Mateo, CA, 1992. Morgan Kaufmann.
  • 135. N. Murata, S. Amari, and S. Yoshizawa. Network information criterion - determining the number of hidden units for an artificial neural network model. IEEE Transactions on Neural Networks, 5:865-872, 1994.
  • 137. R. Nock and P. Lefaucheur. A robust boosting algorithm. In Proc. 13th European Conference on Machine Learning, volume 2430 of LNAI, Helsinki, 2002. Springer Verlag.
  • 138. T. Onoda, G. Rätsch, and K.-R. Müller. An asymptotic analysis of AdaBoost in the binary classification case. In L. Niklasson, M. Bodén, and T. Ziemke, editors, Proc. of the Int. Conf. on Artificial Neural Networks (ICANN'98), pages 195-200, March 1998.
  • 139. T. Onoda, G. Rätsch, and K.-R. Müller. A non-intrusive monitoring system for household electric appliances with inverters. In H. Bothe and R. Rojas, editors, Proc. of NC'2000, Berlin, 2000. ICSC Academic Press Canada/Switzerland.
  • 140. J. O'Sullivan, J. Langford, R. Caruana, and A. Blum. FeatureBoost: A meta-learning algorithm that improves model robustness. In Proceedings, 17th ICML. Morgan Kaufmann, 2000.
  • 141. N. Oza and S. Russell. Experimental comparisons of online and batch versions of bagging and boosting. In Proc. KDD-01, 2001.
  • 143. T. Poggio and F. Girosi. Regularization algorithms for learning that are equivalent to multilayer networks. Science, 247:978-982, 1990.
  • 146. G. Rätsch. Ensemble learning methods for classification. Master's thesis, Dep. of Computer Science, University of Potsdam, April 1998. In German.
  • 147. G. Rätsch. Robust Boosting via Convex Optimization. PhD thesis, University of Potsdam, Computer Science Dept., August-Bebel-Str. 89, 14482 Potsdam, Germany, October 2001.
  • 148. G. Rätsch. Robustes Boosting durch konvexe Optimierung. In D. Wagner et al., editor, Ausgezeichnete Informatikdissertationen 2001, volume D-2 of GI-Edition - Lecture Notes in Informatics (LNI), pages 125-136. Bonner Köllen, 2002.
  • 149. G. Rätsch, A. Demiriz, and K. Bennett. Sparse regression ensembles in infinite and finite hypothesis spaces. Machine Learning, 48(1-3):193-221, 2002. Special Issue on New Methods for Model Selection and Model Combination; also NeuroCOLT2 Technical Report NC-TR-2000-085.
  • 150. G. Rätsch, S. Mika, B. Schölkopf, and K.-R. Müller. Constructing boosting algorithms from SVMs: An application to one-class classification. IEEE PAMI, 24(9), September 2002. In press; earlier version is GMD TechReport No. 119, 2000.
  • 151. G. Rätsch, S. Mika, and M.K. Warmuth. On the convergence of leveraging. NeuroCOLT2 Technical Report 98, Royal Holloway College, London, August 2001. A short version appeared in NIPS 14, MIT Press, 2002.
  • 152. G. Rätsch, S. Mika, and M.K. Warmuth. On the convergence of leveraging. In T.G. Dietterich, S. Becker, and Z. Ghahramani, editors, Advances in Neural Information Processing Systems, volume 14, 2002. In press; longer version also NeuroCOLT Technical Report NC-TR-2001-098.
  • 153. G. Rätsch, T. Onoda, and K.-R. Müller. Soft margins for AdaBoost. Machine Learning, 42(3):287-320, March 2001. Also NeuroCOLT Technical Report NC-TR-1998-021.
  • 154. G. Rätsch, B. Schölkopf, A.J. Smola, S. Mika, T. Onoda, and K.-R. Müller. Robust ensemble learning. In A.J. Smola, P.L. Bartlett, B. Schölkopf, and D. Schuurmans, editors, Advances in Large Margin Classifiers, pages 207-219. MIT Press, Cambridge, MA, 2000.
  • 155. G. Rätsch, A.J. Smola, and S. Mika. Adapting codes and embeddings for polychotomies. In NIPS, volume 15. MIT Press, 2003. Accepted.
  • 157. G. Rätsch and M.K. Warmuth. Maximizing the margin with boosting. In Proc. COLT, volume 2375 of LNAI, pages 319-333, Sydney, 2002. Springer.
  • 159. J. Rissanen. Modeling by shortest data description. Automatica, 14:465-471, 1978.
  • 163. R.E. Schapire. The strength of weak learnability. Machine Learning, 5(2):197-227, 1990.
  • 167. R.E. Schapire, Y. Freund, P.L. Bartlett, and W.S. Lee. Boosting the margin: A new explanation for the effectiveness of voting methods. The Annals of Statistics, 26(5):1651-1686, October 1998.
  • 168. R.E. Schapire and Y. Singer. Improved boosting algorithms using confidence-rated predictions. Machine Learning, 37(3):297-336, December 1999.
  • 170. R.E. Schapire and Y. Singer. BoosTexter: A boosting-based system for text categorization. Machine Learning, 39(2/3):135-168, 2000.
  • 173. B. Schölkopf, R. Herbrich, and A.J. Smola. A generalized representer theorem. In D.P. Helmbold and R.C. Williamson, editors, COLT/EuroCOLT, volume 2111 of LNAI, pages 416-426. Springer, 2001.
  • 175. B. Schölkopf, A. Smola, R.C. Williamson, and P.L. Bartlett. New support vector algorithms. Neural Computation, 12:1207-1245, 2000. Also NeuroCOLT Technical Report NC-TR-1998-031.
  • 177. H. Schwenk and Y. Bengio. Boosting neural networks. Neural Computation, 12(8):1869-1887, 2000.
  • 178. R.A. Servedio. PAC analogues of perceptron and winnow via boosting the margin. In Proc. COLT, pages 148-157, San Francisco, 2000. Morgan Kaufmann.
  • 183. J. Shawe-Taylor and G. Karakoulas. Towards a strategy for boosting regressors. In A.J. Smola, P.L. Bartlett, B. Schölkopf, and D. Schuurmans, editors, Advances in Large Margin Classifiers, pages 247-258, Cambridge, MA, 2000. MIT Press.
  • 184. Y. Singer. Leveraged vector machines. In S.A. Solla, T.K. Leen, and K.-R. Müller, editors, Advances in Neural Information Processing Systems, volume 12, pages 610-616. MIT Press, 2000.
  • 185. D. Tax and R. Duin. Data domain description by support vectors. In M. Verleysen, editor, Proc. ESANN, pages 251-256, Brussels, 1999. D. Facto Press.
  • 186. F. Thollard, M. Sebban, and P. Ezequel. Boosting density function estimators. In Proc. 13th European Conference on Machine Learning, volume 2430 of LNAI, pages 431-443, Helsinki, 2002. Springer Verlag.
  • 188. K. Tsuda, M. Sugiyama, and K.-R. Müller. Subspace information criterion for non-quadratic regularizers - model selection for sparse regressors. IEEE Transactions on Neural Networks, 13(1):70-80, 2002.
  • 189. L.G. Valiant. A theory of the learnable. Communications of the ACM, 27(11):1134-1142, November 1984.
  • 193. V.N. Vapnik and A.Y. Chervonenkis. On the uniform convergence of relative frequencies of events to their probabilities. Theory of Probab. and its Applications, 16(2):264-280, 1971.
  • 194. J. von Neumann. Zur Theorie der Gesellschaftsspiele. Math. Ann., 100:295-320, 1928.
  • 196. R. Zemel and T. Pitassi. A gradient-based boosting algorithm for regression problems. In T.K. Leen, T.G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems, volume 13, pages 696-702. MIT Press, 2001.
  • 198. T. Zhang. A general greedy approximation algorithm with applications. In Advances in Neural Information Processing Systems, volume 14. MIT Press, 2002.
  • 199. T. Zhang. On the dual formulation of regularized linear systems with convex risks. Machine Learning, 46:91-129, 2002.
  • 201. Z.-H. Zhou, Y. Jiang, Y.-B. Yang, and S.-F. Chen. Lung cancer cell identification based on artificial neural network ensembles. Artificial Intelligence in Medicine, 24(1):25-36, 2002.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.