Anderson, R. L. (1946). Missing plot techniques. Biometrics, 2, 41-47.
Batista, G. E. A. P. A., & Monard, M. C. (2003). An analysis of four missing data treatment methods for supervised learning. Applied Artificial Intelligence, 17(5-6), 519-533.
Beinlich, I. A., Suermondt, H. J., Chavez, R. M., & Cooper, G. F. (1989). The ALARM monitoring system: A case study with two probabilistic inference techniques for belief networks. In Proceedings of the Second European Conference on Artificial Intelligence in Medicine (pp. 247-256).
Cano, R., Sordo, C., & Gutiérrez, J. M. (2004). Applications of Bayesian networks in meteorology. In J. A. Gámez, et al. (Eds.), Advances in Bayesian networks (pp. 309-327). Springer-Verlag.
Cooper, G., & Herskovits, E. (1992). A Bayesian method for the induction of probabilistic networks from data. Machine Learning, 9, 309-347.
Dempster, A. P., Laird, N. M., & Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society B, 39, 1-39.
Di Zio, M., Scanu, M., Coppola, L., Luzi, O., & Ponti, A. (2004). Bayesian networks for imputation. Journal of the Royal Statistical Society A, 167(Part 2), 309-322.
Druzdzel, M. J. (1999). SMILE: Structural Modeling, Inference, and Learning Engine and GeNIe: A development environment for graphical decision-theoretic models (Intelligent Systems Demonstration). In Proceedings of the Sixteenth National Conference on Artificial Intelligence (AAAI-99) (pp. 902-903). Menlo Park, CA: AAAI Press/The MIT Press.
Friedman, J. H., Kohavi, R., & Yun, Y. (1996). Lazy decision trees. In Proceedings of the 13th National Conference on Artificial Intelligence (pp. 717-724). Cambridge, MA: AAAI Press/MIT Press.
Friedman, N., Linial, M., Nachman, I., & Pe'er, D. (2000). Using Bayesian networks to analyze expression data. In Proceedings of the Fourth International Annual Conference on Computational Molecular Biology (pp. 127-135). New York: ACM Press.
Gelman, A., Carlin, J. B., Stern, H. S., & Rubin, D. B. (1995). Bayesian data analysis. London: Chapman & Hall.
Ghahramani, Z., & Jordan, M. (1995). Learning from incomplete data (Tech. Rep. AI Lab Memo No. 1509, CBCL Paper No. 108). MIT AI Lab.
Gilks, W. R., & Roberts, G. O. (1996). Strategies for improving MCMC. In W. R. Gilks, S. Richardson, & D. J. Spiegelhalter (Eds.), Markov chain Monte Carlo in practice (pp. 89-114). London: Chapman & Hall.
Heckerman, D. (1995). A tutorial on learning Bayesian networks (Technical Report MSR-TR-95-06). Microsoft Research, Advanced Technology Division, Microsoft Corporation.
Hruschka Jr., E. R., Hruschka, E. R., & Ebecken, N. F. F. (2004). Feature selection by Bayesian networks. Lecture Notes in Artificial Intelligence, 3060, 370-379.
Hsu, W. H. (2004). Genetic wrappers for feature selection in decision tree induction and variable ordering in Bayesian network structure learning. Information Sciences, 163, 103-122.
Jordan, M., & Jacobs, R. (1994). Hierarchical mixtures of experts and the EM algorithm. Neural Computation, 6, 181-214.
Jordan, M., & Xu, L. (1996). Convergence results for the EM approach to mixtures of experts architectures. Neural Networks, 8, 1409-1431.
Kononenko, I., Bratko, I., & Roskar, E. (1984). Experiments in automatic learning of medical diagnostic rules (Tech. Rep.). Ljubljana, Yugoslavia: Jozef Stefan Institute.
Lam, W., & Bacchus, F. (1994). Learning Bayesian belief networks: An approach based on the MDL principle. Computational Intelligence, 10, 269-293.
Madsen, A. L., Lang, M., Kjærulff, U. B., & Jensen, F. (2003). The Hugin Tool for learning Bayesian networks. Lecture Notes in Computer Science, 2711, 594-605.
Merz, C. J., & Murphy, P. M. (1997). UCI Repository of Machine Learning Databases. Irvine, CA: University of California, Department of Information and Computer Science. Retrieved from http://www.ics.uci.edu.
Nigam, K. (2001). Using unlabeled data to improve text classification (Tech. Rep. CMU-CS-01-126). Doctoral dissertation, Computer Science Department, Carnegie Mellon University.
Preece, D. A. (1971). Iterative procedures for missing values in experiments. Technometrics, 13, 743-753.
Quinlan, J. R. (1986). Induction of decision trees. Machine Learning, 1, 81-106.
Redner, R., & Walker, H. (1984). Mixture densities, maximum likelihood and the EM algorithm. SIAM Review, 26(2), 152-239.
Rubin, D. B. (1976). Inference and missing data. Biometrika, 63, 581-592.
Rubin, D. B. (1977). Formalizing subjective notions about the effect of nonrespondents in sample surveys. Journal of the American Statistical Association, 72, 538-543.
Schafer, J. L., & Graham, J. W. (2002). Missing data: Our view of the state of the art. Psychological Methods, 7(2), 147-177.
Schwarz, G. (1978). Estimating the dimension of a model. Annals of Statistics, 6, 461-464.
Spiegelhalter, D. J., & Lauritzen, S. L. (1990). Sequential updating of conditional probabilities on directed graphical structures. Networks, 20, 576-606.
Spiegelhalter, D. J., Thomas, A., & Best, N. G. (1996). Computation on Bayesian graphical models. Bayesian Statistics, 5, 407-425. Retrieved from http://www.mrc.bsu.cam.ac.uk/bugs.
Spiegelhalter, D. J., Thomas, A., & Best, N. G. (1999). WinBUGS: Bayesian inference using Gibbs sampling, Version 1.3. Cambridge, UK: MRC Biostatistics Unit.
Tanner, M. A., & Wong, W. H. (1987). The calculation of posterior distributions by data augmentation (with discussion). Journal of the American Statistical Association, 82, 528-550.
White, A. P. (1987). Probabilistic induction by dynamic path generation in virtual trees. In M. A. Bramer (Ed.), Research and development in expert systems III (pp. 35-46). Cambridge: Cambridge University Press.
Wu, C. F. J. (1983). On the convergence properties of the EM algorithm. The Annals of Statistics, 11(1), 95-103.
Xu, L., & Jordan, M. (1996). On convergence properties of the EM algorithm for Gaussian mixtures. Neural Computation, 8, 129-151.