2. Adam L. Berger, Stephen A. Della Pietra, and Vincent J. Della Pietra. 1996. A maximum entropy approach to natural language processing. Computational Linguistics, 22(1):39–71, March.
4. David Chiang. 2000. Statistical parsing with an automatically-extracted tree adjoining grammar. In Proc. Annual Meeting of ACL, pages 1–6.
5. Michael Collins. 1997. Three generative, lexicalised models for statistical parsing. In Proc. Annual Meeting of ACL, pages 16–23.
6. J. N. Darroch and D. Ratcliff. 1972. Generalized iterative scaling for log-linear models. Ann. Math. Statist., 43:1470–1480.
8. David Palmer. 1997. A trainable rule-based algorithm for word segmentation. In Proc. Annual Meeting of ACL, Madrid.
11. Richard Sproat, Chilin Shih, William Gale, and Nancy Chang. 1996. A stochastic finite-state word-segmentation algorithm for Chinese. Computational Linguistics, 22(3):377–404.
12. Dekai Wu and Pascale Fung. 1994. Improving Chinese tokenization with linguistic filters on statistical lexical acquisition. In Fourth Conference on Applied Natural Language Processing, pages 180–181, Stuttgart.
14. F. Xia, M. Palmer, N. Xue, M. E. Okurowski, J. Kovarik, F. D. Chiou, S. Huang, T. Kroch, and M. Marcus. 2000. Developing guidelines and ensuring consistency for Chinese text annotation. In Proc. of the 2nd Intl. Conf. on Language Resources and Evaluation (LREC 2000).