Volume 3, Issue 2, 2003, Pages 323-359

The subspace information criterion for infinite dimensional hypothesis spaces

Author keywords

Cross validation; Finite sample statistics; Gaussian processes; Generalization error; Kernel regression; Model selection; Reproducing kernel Hilbert space; Subspace information criterion; Unbiased estimators

Indexed keywords

ERROR ANALYSIS; INFORMATION SCIENCE; LEARNING ALGORITHMS; THEOREM PROVING;

EID: 0041965965     PISSN: 15324435     EISSN: None     Source Type: Journal    
DOI: 10.1162/153244303765208412     Document Type: Article
Times cited: 15

References (74)
  • 1 H. Akaike. A new look at the statistical model identification. IEEE Transactions on Automatic Control, AC-19(6):716-723, 1974.
  • 2 H. Akaike. Likelihood and the Bayes procedure. In J. M. Bernardo, M. H. DeGroot, D. V. Lindley, and A. F. M. Smith, editors, Bayesian Statistics, pages 141-166, Valencia, 1980. University Press.
  • 7 O. Bunke and B. Droge. Bootstrap and cross-validation estimates of the prediction error for linear regression models. Annals of Statistics, 12:1400-1424, 1984.
  • 8 C. J. C. Burges. A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery, 2(2):121-167, 1998.
  • 12 P. Craven and G. Wahba. Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation. Numerische Mathematik, 31:377-403, 1979.
  • 14 I. Daubechies. Ten Lectures on Wavelets. Society for Industrial and Applied Mathematics, Philadelphia, Pennsylvania, 1992.
  • 15 L. Devroye, L. Györfi, and G. Lugosi. A Probabilistic Theory of Pattern Recognition. Number 31 in Applications of Mathematics. Springer, New York, 1996.
  • 17 D. L. Donoho and I. M. Johnstone. Ideal spatial adaptation by wavelet shrinkage. Biometrika, 81:425-455, 1994.
  • 19 K. Fukumizu. Statistical active learning in multilayer perceptrons. IEEE Transactions on Neural Networks, 11(1):17-26, 2000.
  • 20 S. Geman, E. Bienenstock, and R. Doursat. Neural networks and the bias/variance dilemma. Neural Computation, 4(1):1-58, 1992.
  • 23 F. Girosi. An equivalence between sparse approximation and support vector machines. Neural Computation, 10(6):1455-1480, 1998.
  • 25 T. Heskes. Bias/variance decompositions for likelihood-based estimators. Neural Computation, 10(6):1425-1433, 1998.
  • 30 G. S. Kimeldorf and G. Wahba. A correspondence between Bayesian estimation on stochastic processes and smoothing by splines. Annals of Mathematical Statistics, 41(2):495-502, 1970.
  • 32 S. Konishi and G. Kitagawa. Generalized information criteria in model selection. Biometrika, 83:875-890, 1996.
  • 33 D. J. C. MacKay. Bayesian interpolation. Neural Computation, 4(3):415-447, 1992a.
  • 34 D. J. C. MacKay. Information-based objective functions for active data selection. Neural Computation, 4(4):590-604, 1992b.
  • 38 N. Murata, S. Yoshizawa, and S. Amari. Network information criterion - Determining the number of hidden units for an artificial neural network model. IEEE Transactions on Neural Networks, 5(6):865-872, 1994.
  • 39 R. Nishii. Asymptotic properties of criteria for selection of variables in multiple regression. Annals of Statistics, 12:758-765, 1984.
  • 40 M. J. L. Orr. Introduction to radial basis function networks. Technical report, Center for Cognitive Science, University of Edinburgh, 1996. Available electronically at http://www.anc.ed.ac.uk/~mjo/papers/intro.ps.gz.
  • 42 J. Rissanen. Modeling by shortest data description. Automatica, 14:465-471, 1978.
  • 44 J. Rissanen. Fisher information and stochastic complexity. IEEE Transactions on Information Theory, IT-42(1):40-47, 1996.
  • 50 G. Schwarz. Estimating the dimension of a model. Annals of Statistics, 6:461-464, 1978.
  • 51 R. Shibata. An optimal selection of regression variables. Biometrika, 68(1):45-54, 1981.
  • 53 A. J. Smola, B. Schölkopf, and K.-R. Müller. The connection between regularization operators and support vector kernels. Neural Networks, 11(4):637-649, 1998.
  • 54 N. Sugiura. Further analysis of the data by Akaike's information criterion and the finite corrections. Communications in Statistics: Theory and Methods, 7(1):13-26, 1978.
  • 55 M. Sugiyama, D. Imaizumi, and H. Ogawa. Subspace information criterion for image restoration - Optimizing parameters in linear filters. IEICE Transactions on Information and Systems, E84-D(9):1249-1256, 2001.
  • 56 M. Sugiyama and H. Ogawa. Incremental active learning for optimal generalization. Neural Computation, 12(12):2909-2940, 2000.
  • 57 M. Sugiyama and H. Ogawa. Subspace information criterion for model selection. Neural Computation, 13(8):1863-1889, 2001.
  • 58 M. Sugiyama and H. Ogawa. Optimal design of regularization term and regularization parameter by subspace information criterion. Neural Networks, 15(3):349-361, 2002a.
  • 60 M. Sugiyama and H. Ogawa. A unified method for optimizing linear image restoration filters. Signal Processing, 82(11):1773-1787, 2002c.
  • 61 K. Takeuchi. Distribution of information statistics and validity criteria of models. Mathematical Science, 153:12-18, 1976. (In Japanese).
  • 62 A. Tanaka, H. Imai, and M. Miyakoshi. Choosing the parameter of image restoration filters by modified subspace information criterion. IEICE Transactions on Fundamentals, E85-A(5):1104-1110, 2002.
  • 64 K. Tsuda, M. Sugiyama, and K.-R. Müller. Subspace information criterion for non-quadratic regularizers - Model selection for sparse regressors. IEEE Transactions on Neural Networks, 13(1):70-80, 2002.
  • 65 V. Vapnik and O. Chapelle. Bounds on error expectation for support vector machines. Neural Computation, 12(9):2013-2036, 2000.
  • 69 G. Wahba. Spline Models for Observational Data. Society for Industrial and Applied Mathematics, Philadelphia, Pennsylvania, 1990.
  • 70 S. Watanabe. Algebraic analysis for non-identifiable learning machines. Neural Computation, 13(4):899-933, 2001.
  • 71 C. K. I. Williams. Prediction with Gaussian processes: From linear regression to linear prediction and beyond. In M. I. Jordan, editor, Learning in Graphical Models, pages 599-621. The MIT Press, Cambridge, 1998.
  • 72 C. K. I. Williams and C. E. Rasmussen. Gaussian processes for regression. In D. S. Touretzky, M. C. Mozer, and M. E. Hasselmo, editors, Advances in Neural Information Processing Systems, volume 8, pages 514-520. The MIT Press, 1996.
  • 73 P. M. Williams. Bayesian regularization and pruning using a Laplace prior. Neural Computation, 7(1):117-143, 1995.
  • 74 K. Yamanishi. A decision-theoretic extension of stochastic complexity and its application to learning. IEEE Transactions on Information Theory, IT-44(4):1424-1439, 1998.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.