



2012, Pages 2835-2850

Factored language model based on recurrent neural network

Author keywords

Factored language model; Large vocabulary continuous speech recognition; Recurrent neural network

Indexed keywords

BETTER PERFORMANCE; FACTORED LANGUAGE MODEL; LANGUAGE MODEL; LARGE VOCABULARY CONTINUOUS SPEECH RECOGNITION; LINGUISTIC INFORMATION; NETWORK-BASED; TREEBANKS; WORD ERROR RATE

EID: 84876802979     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 27

References (32)
  • 8 Chen, S. F. and Goodman, J. T. (1996). An empirical study of smoothing techniques for language modeling. In Proceedings of ACL 1996, pages 310-318, California, USA.
  • 9 Collins, M., Roark, B., and Saraclar, M. (2005). Discriminative syntactic language modeling for speech recognition. In Proceedings of ACL 2005, pages 507-514, Michigan, USA.
  • 11 Emami, A. and Jelinek, F. (2004). Exact training of a neural syntactic language model. In Proceedings of ICASSP 2004, pages 245-248, Montreal, Canada.
  • 14 Goodman, J. (2001). Classes for fast maximum entropy training. In Proceedings of ICASSP 2001, Utah, USA.
  • 15 Hsu, B.-J. P. and Glass, J. (2006). Style and topic language model adaptation using HMM-LDA. In Proceedings of EMNLP 2006, pages 373-381, Sydney, Australia.
  • 16 Khudanpur, S. and Wu, J. (2000). Maximum entropy techniques for exploiting syntactic, semantic and collocational dependencies in language modeling. Computer Speech and Language, pages 355-372.
  • 17 Koehn, P. and Hoang, H. (2007). Factored translation models. In Proceedings of EMNLP 2007, pages 868-876, Prague, Czech Republic.
  • 20 Mikolov, T., Anoop, D., Stefan, K., Burget, L., and Cernocky, J. (2011a). Empirical evaluation and combination of advanced language modeling techniques. In Proceedings of INTERSPEECH 2011, pages 605-608, Florence, Italy.
  • 25 Rosenfeld, R. (1996). A maximum entropy approach to adaptive statistical language modeling. Computational Linguistics, 10(3):187-228.
  • 26 Schwenk, H. (2007). Continuous space language models. Computer Speech and Language, 21(3):492-518.
  • 28 Seide, F., Li, G., Chen, X., and Yu, D. (2011). Feature engineering in context-dependent deep neural networks for conversational speech transcription. In Proceedings of ASRU 2011, Hawaii, USA.
  • 29 Shi, Y., Wiggers, P., and Catholijn, M. J. (2012). Towards recurrent neural networks language models with linguistic and contextual features. In Proceedings of Interspeech 2012.
  • 30 Stolcke, A. (2002). SRILM - an extensible language modeling toolkit. In Proceedings of INTERSPEECH 2002, pages 901-904, Colorado, USA.
  • 32 Xu, P. and Jelinek, F. (2004). Random forests in language modeling. In Proceedings of EMNLP 2004, pages 325-332, Barcelona, Spain.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.