3. Friedman, J. H., & Tukey, J. W. (1974). A projection pursuit algorithm for exploratory data analysis. IEEE Transactions on Computers, 23, 881-890.
6. Kronmal, R., & Tarter, M. (1968). The estimation of probability densities and cumulatives by Fourier series methods. Journal of the American Statistical Association, 63, 925-952.
8. Ogawa, H., & Oja, E. (1986). Can we solve the continuous Karhunen-Loève eigenproblem from discrete data? Transactions of the IECE of Japan, 69(9), 1020-1029.
10. Rosipal, R., & Girolami, M. (2001). An expectation maximisation approach to nonlinear component analysis. Neural Computation, 13(3), 500-505.
11. Schölkopf, B., Burges, C., & Smola, A. (Eds.). (1999). Advances in kernel methods: Support vector learning. Cambridge, MA: MIT Press.
13. Schölkopf, B., Smola, A., & Müller, K. R. (1998). Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10(5), 1299-1319.
15. Williams, C. K. I., & Seeger, M. (2000). The effect of the input density distribution on kernel-based classifiers. In P. Langley (Ed.), Proceedings of the Seventeenth International Conference on Machine Learning (pp. 1159-1166). San Mateo, CA: Morgan Kaufmann.
16. Williams, C. K. I., & Seeger, M. (2001). Using the Nyström method to speed up kernel machines. In T. K. Leen, T. G. Dietterich, & V. Tresp (Eds.), Advances in neural information processing systems, 13 (pp. 682-688). Cambridge, MA: MIT Press.
17. Zhu, H., Williams, C. K. I., Rohwer, R. J., & Morciniec, M. (1998). Gaussian regression and optimal finite dimensional linear models. In C. M. Bishop (Ed.), Neural networks and machine learning. Berlin: Springer-Verlag.