IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 32, Issue 8, 2010, Pages 1522-1528

Maximum Likelihood Model Selection for 1-Norm Soft Margin SVMs with Multiple Parameters

Author keywords

Maximum likelihood; Model selection; Regularization; Support vector machines

Indexed keywords

BINARY CLASSIFICATION; COHERENT FRAMEWORKS; CONDITIONAL PROBABILITIES; GRADIENT-BASED OPTIMIZATION; HYPERPARAMETERS; LIKELIHOOD FUNCTIONS; LOGISTIC REGRESSIONS; MODEL SELECTION; MODEL SELECTION PROBLEM; MULTIPLE KERNELS; MULTIPLE PARAMETERS; OVERFITTING; PRIOR DISTRIBUTION; REGULARIZATION; SOFT MARGINS; STATE-OF-THE-ART METHODS; SVM MODEL;

EID: 77953812676     PISSN: 0162-8828     EISSN: None     Source Type: Journal
DOI: 10.1109/TPAMI.2010.95     Document Type: Article
Times cited: 25

References (36)
  • 2
    • C. Cortes and V. Vapnik, "Support-Vector Networks," Machine Learning, vol.20, no.3, pp. 273-297, 1995.
  • 3
    • V. Vapnik, Statistical Learning Theory. Wiley, 1998.
  • 6
    • V. Vapnik and O. Chapelle, "Bounds on Error Expectation for Support Vector Machines," Neural Computation, vol.12, pp. 2013-2036, 2000.
  • 8
    • O. Chapelle, V. Vapnik, O. Bousquet, and S. Mukherjee, "Choosing Multiple Parameters for Support Vector Machines," Machine Learning, vol.46, no.1, pp. 131-159, 2002. DOI: 10.1023/A:1012450327387
  • 9
    • S.S. Keerthi, "Efficient Tuning of SVM Hyperparameters Using Radius/Margin Bound and Iterative Algorithms," IEEE Trans. Neural Networks, vol.13, no.5, pp. 1225-1229, Sept. 2002.
  • 10
    • T. Glasmachers and C. Igel, "Gradient-Based Adaptation of General Gaussian Kernels," Neural Computation, vol.17, no.10, pp. 2099-2105, 2005.
  • 11
    • S.S. Keerthi, V. Sindhwani, and O. Chapelle, "An Efficient Method for Gradient-Based Adaptation of Hyperparameters in SVM Models," Advances in Neural Information Processing Systems, vol.19, B. Schölkopf, J. Platt, and T. Hoffman, eds., MIT Press, 2007.
  • 12
    • C. Igel, T. Glasmachers, B. Mersch, N. Pfeifer, and P. Meinicke, "Gradient-Based Optimization of Kernel-Target Alignment for Sequence Kernels Applied to Bacterial Gene Start Detection," IEEE/ACM Trans. Computational Biology and Bioinformatics, vol.4, no.2, pp. 216-226, Apr. 2007.
  • 13
    • P.S. Bradley and O.L. Mangasarian, "Feature Selection via Concave Minimization and Support Vector Machines," Proc. Int'l Conf. Machine Learning, pp. 82-90, 1998.
  • 14
    • J. Platt, "Probabilistic Outputs for Support Vector Machines and Comparisons to Regularized Likelihood Methods," Advances in Large Margin Classifiers, pp. 61-74, MIT Press, 1999.
  • 15
    • M.E. Tipping, "Sparse Bayesian Learning and the Relevance Vector Machine," J. Machine Learning Research, vol.1, pp. 211-244, 2001.
  • 16
    • T. Zhang, "Statistical Behavior and Consistency of Classification Methods Based on Convex Risk Minimization," Annals of Statistics, vol.32, no.1, pp. 56-85, 2004.
  • 17
    • P.L. Bartlett and A. Tewari, "Sparseness vs Estimating Conditional Probabilities: Some Asymptotic Results," J. Machine Learning Research, vol.8, pp. 775-790, 2007.
  • 18
    • M. Opper and O. Winther, "Gaussian Process Classification and SVM: Mean Field Results," Advances in Large Margin Classifiers, P. Bartlett, B. Schölkopf, D. Schuurmans, and A. Smola, eds., MIT Press, 1999.
  • 19
    • M. Seeger, "Bayesian Model Selection for Support Vector Machines, Gaussian Processes and Other Kernel Classifiers," Advances in Neural Information Processing Systems, vol.12, pp. 603-609, MIT Press, 2000.
  • 20
    • C. Gold and P. Sollich, "Model Selection for Support Vector Machine Classification," Neurocomputing, vol.55, nos. 1-2, pp. 221-249, 2003.
  • 21
    • G.C. Cawley and N.L.C. Talbot, "Preventing Over-Fitting During Model Selection via Bayesian Regularisation of the Hyper-Parameters," J. Machine Learning Research, vol.8, pp. 841-861, 2007.
  • 22
    • T. Glasmachers and C. Igel, "Maximum-Gain Working Set Selection for Support Vector Machines," J. Machine Learning Research, vol.7, pp. 1437-1466, 2006.
  • 24
    • F. Friedrichs and C. Igel, "Evolutionary Tuning of Multiple SVM Parameters," Neurocomputing, vol.64, no.C, pp. 107-117, 2005.
  • 25
    • T. Glasmachers and C. Igel, "Uncertainty Handling in Model Selection for Support Vector Machines," Parallel Problem Solving from Nature, G. Rudolph, T. Jansen, S. Lucas, C. Poloni, and N. Beume, eds., pp. 185-194, Springer, 2008.
  • 26
    • K.M. Chung, W.C. Kao, C.L. Sun, L.L. Wang, and C.-J. Lin, "Radius Margin Bounds for Support Vector Machines with the RBF Kernel," Neural Computation, vol.15, no.11, pp. 2643-2681, 2003.
  • 27
    • K. Duan, S.S. Keerthi, and A.N. Poo, "Evaluation of Simple Performance Measures for Tuning SVM Hyperparameters," Neurocomputing, vol.51, no.1, pp. 41-60, 2003.
  • 28
    • H.-T. Lin, C.-J. Lin, and R.C. Weng, "A Note on Platt's Probabilistic Outputs for Support Vector Machines," Machine Learning, vol.68, pp. 267-276, 2007.
  • 30
    • G. Wahba, "Soft and Hard Classification by Reproducing Kernel Hilbert Space Methods," Proc. Nat'l Academy of Sciences USA, vol.99, no.26, pp. 16524-16530, 2002.
  • 31
    • G.C. Cawley and N.L.C. Talbot, "Efficient Approximate Leave-One-Out Cross-Validation for Kernel Logistic Regression," Machine Learning, vol.71, no.2, pp. 243-264, 2008.
  • 32
    • T. Suttorp, N. Hansen, and C. Igel, "Efficient Covariance Matrix Update for Variable Metric Evolution Strategies," Machine Learning, vol.75, no.2, pp. 167-197, 2009.
  • 33
    • N. Hansen and A. Ostermeier, "Completely Derandomized Self-Adaptation in Evolution Strategies," Evolutionary Computation, vol.9, no.2, pp. 159-195, 2001.
  • 34
    • C. Igel and M. Hüsken, "Empirical Evaluation of the Improved Rprop Learning Algorithm," Neurocomputing, vol.50, pp. 105-123, 2003.
  • 35


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.