Volume, Issue, 2012, Pages 11-19

Large, pruned or continuous space language models on a GPU for statistical machine translation

Author keywords

[No Author keywords available]

Indexed keywords

COMPUTATIONAL LINGUISTICS; GRAPHICS PROCESSING UNIT; MACHINE TRANSLATION; PROGRAM PROCESSORS; SPEECH RECOGNITION; SPEECH TRANSMISSION;

EID: 85045980083    PISSN: None    EISSN: None    Source Type: Conference Proceeding
DOI: None    Document Type: Conference Paper
Times cited: 77

References (27)
  • 1. Yoshua Bengio and Réjean Ducharme. 2001. A neural probabilistic language model. In NIPS, volume 13, pages 932-938.
  • 2. Yoshua Bengio, Réjean Ducharme, Pascal Vincent, and Christian Jauvin. 2003. A neural probabilistic language model. JMLR, 3(2):1137-1155.
  • 4. Thorsten Brants, Ashok C. Popat, Peng Xu, Franz J. Och, and Jeffrey Dean. 2007. Large language models in machine translation. In EMNLP, pages 858-867.
  • 5. Stanley F. Chen and Joshua T. Goodman. 1999. An empirical study of smoothing techniques for language modeling. Computer Speech & Language, 13(4):359-394.
  • 6. Marcello Federico and Mauro Cettolo. 2007. Efficient handling of n-gram language models for statistical machine translation. In Second Workshop on SMT, pages 88-95.
  • 7. Kenneth Heafield. 2011. KenLM: Faster and smaller language model queries. In Sixth Workshop on SMT, pages 187-197.
  • 9. L. Lamel, J.-L. Gauvain, V.-B. Le, I. Oparin, and S. Meng. 2011. Improved models for Mandarin speech-to-text transcription. In ICASSP, pages 4660-4663.
  • 10. H. S. Le, A. Allauzen, G. Wisniewski, and F. Yvon. 2010. Training continuous space language models: Some practical issues. In EMNLP, pages 778-788.
  • 11. H. S. Le, I. Oparin, A. Allauzen, J.-L. Gauvain, and F. Yvon. 2011a. Structured output layer neural network language model. In ICASSP, pages 5524-5527.
  • 13. X. Liu, M. J. F. Gales, and P. C. Woodland. 2011. Improving LVCSR system combination using neural network language model cross adaptation. In Interspeech, pages 2857-2860.
  • 16. Andriy Mnih and Geoffrey Hinton. 2008. A scalable hierarchical distributed language model. In NIPS.
  • 17. Robert C. Moore and William Lewis. 2010. Intelligent selection of language model training data. In ACL, pages 220-224.
  • 19. Junho Park, Xunying Liu, Mark J. F. Gales, and Phil C. Woodland. 2010. Improved neural network based language modelling and adaptation. In Interspeech, pages 1041-1044.
  • 20. Holger Schwenk and Yannick Estève. 2008. Data selection and smoothing in an open-source system for the 2008 NIST machine translation evaluation. In Interspeech, pages 2727-2730.
  • 21. Holger Schwenk and Jean-Luc Gauvain. 2002. Connectionist language modeling for large vocabulary continuous speech recognition. In ICASSP, volume 1, pages 765-768.
  • 23. Holger Schwenk. 2004. Efficient training of large neural networks for language modeling. In IJCNN, pages 3059-3062.
  • 24. Holger Schwenk. 2007. Continuous space language models. Computer Speech and Language, 21:492-518.
  • 25. Holger Schwenk. 2010. Continuous space language models for statistical machine translation. The Prague Bulletin of Mathematical Linguistics, (93):137-146.
  • 26. David Talbot and Miles Osborne. 2007. Smoothed Bloom filter language models: Tera-scale LMs on the cheap. In EMNLP, pages 468-476.
  • 27. Puyang Xu, Asela Gunawardana, and Sanjeev Khudanpur. 2011. Efficient subsampling for training complex language models. In EMNLP, pages 1128-1136.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.