4. W.S. DeSarbo and W.L. Cron. A maximum likelihood methodology for clusterwise linear regression. Journal of Classification, 5:249-282, 1988.
6. C. Fraley and A.E. Raftery. How many clusters? Which clustering method? Answers via model-based cluster analysis. Computer Journal, 41:578-588, 1998.
7. J.J. Heckman. Sample selection bias as a specification error. Econometrica, 47:153-162, 1979.
8. C. Hennig. Identifiability of models for clusterwise linear regressions. Journal of Classification, 17:273-296, 2000.
10. R.A. Jacobs, M.I. Jordan, S.J. Nowlan, and G.E. Hinton. Adaptive mixtures of local experts. Neural Computation, 3:79-87, 1991.
11. M.I. Jordan and R.A. Jacobs. Hierarchical mixtures of experts and the EM algorithm. Neural Computation, 6:181-214, 1994.
12. C. Keribin. Consistent estimation of the order of mixture models. Technical report, Université d'Evry-Val d'Essonne, Laboratoire Analyse et Probabilité, 1997.
14. H. Shimodaira. Improving predictive inference under covariate shift by weighting the log-likelihood function. Journal of Statistical Planning and Inference, 90:227-244, 2000.
15. M. Sugiyama and K.-R. Müller. Input-dependent estimation of generalisation error under covariate shift. Statistics and Decisions, 23:249-279, 2005.
17. J.K. Vermunt. A general non-parametric approach to unobserved heterogeneity in the analysis of event history data. In J. Hagenaars and A. McCutcheon, editors, Applied Latent Class Models. Cambridge University Press, 2002.
18. M. Wedel and W.S. DeSarbo. A mixture likelihood approach for generalised linear models. Journal of Classification, 12:21-55, 1995.
19. B. Zadrozny. Learning and evaluating classifiers under sample selection bias. In Proceedings of ICML, 2004.