Volume 1, Issue , 2006, Pages 225-232

Approximation lasso methods for language modeling

Author keywords

[No Author keywords available]

Indexed keywords

COMPUTATIONAL LINGUISTICS; MAXIMUM LIKELIHOOD ESTIMATION; NATURAL LANGUAGE PROCESSING SYSTEMS;

EID: 34548071816     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.3115/1220175.1220204     Document Type: Conference Paper
Times cited: 10

References (18)
  • 1. Bacchiani, M., Roark, B., and Saraclar, M. 2004. Language model adaptation with MAP estimation and the perceptron algorithm. In HLT-NAACL 2004, pp. 21-24.
  • 2. Collins, M. and Koo, T. 2005. Discriminative reranking for natural language parsing. Computational Linguistics, 31(1): 25-69.
  • 6. Freund, Y., Iyer, R., Schapire, R. E., and Singer, Y. 1998. An efficient boosting algorithm for combining preferences. In ICML'98.
  • 8. Gao, J., Suzuki, H., and Wen, Y. 2002. Exploiting headword dependency and predictive clustering for language modeling. In EMNLP 2002.
  • 9. Gao, J., Yu, H., Yuan, W., and Xu, P. 2005. Minimum sample risk methods for language modeling. In HLT/EMNLP 2005.
  • 12. Roark, B., Saraclar, M., and Collins, M. 2004. Corrective language modeling for large vocabulary ASR with the perceptron algorithm. In ICASSP 2004.
  • 13. Schapire, R. E. and Singer, Y. 1999. Improved boosting algorithms using confidence-rated predictions. Machine Learning, 37(3): 297-336.
  • 14. Suzuki, H. and Gao, J. 2005. A comparative study on language model adaptation using new evaluation metrics. In HLT/EMNLP 2005.
  • 15. Tibshirani, R. 1996. Regression shrinkage and selection via the lasso. J. R. Statist. Soc. B, 58(1): 267-288.
  • 16. Yuan, W., Gao, J., and Suzuki, H. 2005. An empirical study on language model adaptation using a metric of domain similarity. In IJCNLP 05.
  • 18


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.