Volume , Issue , 2011, Pages 605-608

Empirical evaluation and combination of advanced language modeling techniques

Author keywords

Language modeling; Model combination; Neural networks; Speech recognition

Indexed keywords

Class-based; Empirical evaluations; Good-Turing; Individual models; Language model; Language modeling; Linear interpolation; Maximum entropy models; Model combination; Random forests; State of the art; Structured language; Trigrams

EID: 84865803833     PISSN: None     EISSN: 1990-9772     Source Type: Conference Proceeding
DOI: None     Document Type: Conference Paper
Times cited: 282
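The indexed keywords name linear interpolation as the model-combination technique evaluated in this paper. As a minimal, hypothetical sketch (not code from the paper; the function and the weight values below are invented for illustration), linearly interpolating the word probabilities of several language models looks like this:

```python
# Hypothetical sketch of linear interpolation for language model combination.
# Each component model assigns a probability P_i(w | h) to the same word w in
# context h; the combined probability is a weighted average whose weights sum to 1.

def interpolate(probs, weights):
    """Return sum_i weights[i] * probs[i]; weights are typically tuned on held-out data."""
    assert abs(sum(weights) - 1.0) < 1e-9, "interpolation weights must sum to 1"
    return sum(w * p for w, p in zip(weights, probs))

# Example with three hypothetical component models (e.g. n-gram, RNN, maximum entropy).
p_word = [0.012, 0.020, 0.015]      # P_i(w | h) from each model
weights = [0.5, 0.3, 0.2]           # assumed weights, not values from the paper
print(interpolate(p_word, weights))  # -> 0.015
```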

References (20)
  • 7
    • Denis Filimonov and Mary Harper. A joint language model with fine-grain syntactic tags. In: Proc. EMNLP, 2009.
  • 8
    • Ahmad Emami and Frederick Jelinek. Exact training of a neural syntactic language model. In: Proc. ICASSP, 2004.
  • 13
    • Tanel Alumäe and Mikko Kurimo. Efficient estimation of maximum entropy language models with N-gram features: An SRILM extension. In: Proc. INTERSPEECH, 2010.
  • 14
    • Puyang Xu, D. Karakos, and S. Khudanpur. Self-supervised discriminative training of statistical language models. In: Proc. ASRU, 2009.
  • 17
    • W. Wang and M. Harper. The SuperARV language model: Investigating the effectiveness of tightly integrating multiple knowledge sources. In: Proc. EMNLP, 2002.
  • 20
    • D. E. Rumelhart, G. E. Hinton, and R. J. Williams. Learning internal representations by back-propagating errors. Nature, 323:533-536, 1986.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.