1. H. Akaike. A new look at the statistical model identification. IEEE Transactions on Automatic Control, AC-19(6):716-723, 1974.
2. H. Akaike. Likelihood and the Bayes procedure. In J. M. Bernardo, M. H. DeGroot, D. V. Lindley, and A. F. M. Smith, editors, Bayesian Statistics, pages 141-166, Valencia, 1980. University Press.
4. S. Amari, N. Fujita, and S. Shinomoto. Four types of learning curves. Neural Computation, 4(4):605-618, 1992.
7. O. Bunke and B. Droge. Bootstrap and cross-validation estimates of the prediction error for linear regression models. Annals of Statistics, 12:1400-1424, 1984.
8. C. J. C. Burges. A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery, 2(2):121-167, 1998.
9. S. S. Chen, D. L. Donoho, and M. A. Saunders. Atomic decomposition by basis pursuit. SIAM Journal on Scientific Computing, 20(1):33-61, 1998.
10. V. Cherkassky, X. Shao, F. M. Mulier, and V. N. Vapnik. Model complexity control for regression using VC generalization bounds. IEEE Transactions on Neural Networks, 10(5):1075-1089, 1999.
12. P. Craven and G. Wahba. Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation. Numerische Mathematik, 31:377-403, 1979.
14. I. Daubechies. Ten Lectures on Wavelets. Society for Industrial and Applied Mathematics, Philadelphia, Pennsylvania, 1992.
15. L. Devroye, L. Györfi, and G. Lugosi. A Probabilistic Theory of Pattern Recognition. Number 31 in Applications of Mathematics. Springer, New York, 1996.
17. D. L. Donoho and I. M. Johnstone. Ideal spatial adaptation by wavelet shrinkage. Biometrika, 81:425-455, 1994.
19. K. Fukumizu. Statistical active learning in multilayer perceptrons. IEEE Transactions on Neural Networks, 11(1):17-26, 2000.
20. S. Geman, E. Bienenstock, and R. Doursat. Neural networks and the bias/variance dilemma. Neural Computation, 4(1):1-58, 1992.
23. F. Girosi. An equivalence between sparse approximation and support vector machines. Neural Computation, 10(6):1455-1480, 1998.
25. T. Heskes. Bias/variance decompositions for likelihood-based estimators. Neural Computation, 10(6):1425-1433, 1998.
30. G. S. Kimeldorf and G. Wahba. A correspondence between Bayesian estimation on stochastic processes and smoothing by splines. Annals of Mathematical Statistics, 41(2):495-502, 1970.
32. S. Konishi and G. Kitagawa. Generalized information criteria in model selection. Biometrika, 83:875-890, 1996.
33. D. J. C. MacKay. Bayesian interpolation. Neural Computation, 4(3):415-447, 1992a.
34. D. J. C. MacKay. Information-based objective functions for active data selection. Neural Computation, 4(4):590-604, 1992b.
37. K.-R. Müller, S. Mika, G. Rätsch, K. Tsuda, and B. Schölkopf. An introduction to kernel-based learning algorithms. IEEE Transactions on Neural Networks, 12(2):181-201, 2001.
38. N. Murata, S. Yoshizawa, and S. Amari. Network information criterion - Determining the number of hidden units for an artificial neural network model. IEEE Transactions on Neural Networks, 5(6):865-872, 1994.
39. R. Nishii. Asymptotic properties of criteria for selection of variables in multiple regression. Annals of Statistics, 12:758-765, 1984.
40. M. J. L. Orr. Introduction to radial basis function networks. Technical report, Center for Cognitive Science, University of Edinburgh, 1996. Available electronically at http://www.anc.ed.ac.uk/~mjo/papers/intro.ps.gz.
41. C. E. Rasmussen, R. M. Neal, G. E. Hinton, D. van Camp, M. Revow, Z. Ghahramani, R. Kustra, and R. Tibshirani. The DELVE manual, 1996. Available electronically at http://www.cs.toronto.edu/~delve/.
42. J. Rissanen. Modeling by shortest data description. Automatica, 14:465-471, 1978.
44. J. Rissanen. Fisher information and stochastic complexity. IEEE Transactions on Information Theory, IT-42(1):40-47, 1996.
47. B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors. Advances in Kernel Methods: Support Vector Learning. The MIT Press, Cambridge, MA, 1998.
48. B. Schölkopf, A. Smola, R. Williamson, and P. Bartlett. New support vector algorithms. Neural Computation, 12(5):1207-1245, 2000.
50. G. Schwarz. Estimating the dimension of a model. Annals of Statistics, 6:461-464, 1978.
51. R. Shibata. An optimal selection of regression variables. Biometrika, 68(1):45-54, 1981.
53. A. J. Smola, B. Schölkopf, and K.-R. Müller. The connection between regularization operators and support vector kernels. Neural Networks, 11(4):637-649, 1998.
54. N. Sugiura. Further analysis of the data by Akaike's information criterion and the finite corrections. Communications in Statistics: Theory and Methods, 7(1):13-26, 1978.
55. M. Sugiyama, D. Imaizumi, and H. Ogawa. Subspace information criterion for image restoration - Optimizing parameters in linear filters. IEICE Transactions on Information and Systems, E84-D(9):1249-1256, 2001.
56. M. Sugiyama and H. Ogawa. Incremental active learning for optimal generalization. Neural Computation, 12(12):2909-2940, 2000.
57. M. Sugiyama and H. Ogawa. Subspace information criterion for model selection. Neural Computation, 13(8):1863-1889, 2001.
58. M. Sugiyama and H. Ogawa. Optimal design of regularization term and regularization parameter by subspace information criterion. Neural Networks, 15(3):349-361, 2002a.
60. M. Sugiyama and H. Ogawa. A unified method for optimizing linear image restoration filters. Signal Processing, 82(11):1773-1787, 2002c.
61. K. Takeuchi. Distribution of information statistics and validity criteria of models. Mathematical Science, 153:12-18, 1976. (In Japanese).
62. A. Tanaka, H. Imai, and M. Miyakoshi. Choosing the parameter of image restoration filters by modified subspace information criterion. IEICE Transactions on Fundamentals, E85-A(5):1104-1110, 2002.
64. K. Tsuda, M. Sugiyama, and K.-R. Müller. Subspace information criterion for non-quadratic regularizers - Model selection for sparse regressors. IEEE Transactions on Neural Networks, 13(1):70-80, 2002.
65. V. Vapnik and O. Chapelle. Bounds on error expectation for support vector machines. Neural Computation, 12(9):2013-2036, 2000.
69. G. Wahba. Spline Models for Observational Data. Society for Industrial and Applied Mathematics, Philadelphia, Pennsylvania, 1990.
70. S. Watanabe. Algebraic analysis for non-identifiable learning machines. Neural Computation, 13(4):899-933, 2001.
71. C. K. I. Williams. Prediction with Gaussian processes: From linear regression to linear prediction and beyond. In M. I. Jordan, editor, Learning in Graphical Models, pages 599-621. The MIT Press, Cambridge, 1998.
72. C. K. I. Williams and C. E. Rasmussen. Gaussian processes for regression. In D. S. Touretzky, M. C. Mozer, and M. E. Hasselmo, editors, Advances in Neural Information Processing Systems, volume 8, pages 514-520. The MIT Press, 1996.
73. P. M. Williams. Bayesian regularization and pruning using a Laplace prior. Neural Computation, 7(1):117-143, 1995.
74. K. Yamanishi. A decision-theoretic extension of stochastic complexity and its application to learning. IEEE Transactions on Information Theory, IT-44(4):1424-1439, 1998.