2006, Pages 460-466

Minority vote: At-least-N voting improves recall for extracting relations

Author keywords

[No Author keywords available]

Indexed keywords

CLASS LABELS; COREFERENCE; HIGH-PRECISION; NAMED ENTITIES; RELATION EXTRACTION;

EID: 85119972733     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 14

References (18)
  • 3. L. Breiman. 1996. Bagging predictors. Machine Learning, 24:123-140.
  • 4. E. Brill and J. Wu. 1998. Classifier combination for improved lexical disambiguation. In Proceedings of COLING-ACL'98, pages 191-195, August.
  • 5. Radu Florian and David Yarowsky. 2002. Modeling consensus: Classifier combination for word sense disambiguation. In Proceedings of EMNLP'02, pages 25-32.
  • 8. J. Henderson and E. Brill. 1999. Exploiting diversity in natural language processing: Combining parsers. In Proceedings of EMNLP'99, pages 187-194.
  • 10. Nanda Kambhatla. 2004. Combining lexical, syntactic, and semantic features with maximum entropy models for information extraction. In Proceedings of the 42nd Annual Meeting of the Association for Computational Linguistics, pages 178-181, Barcelona, Spain, July. Association for Computational Linguistics.
  • 12. NIST. 2004. The ACE evaluation plan. www.nist.gov/speech/tests/ace/index.htm.
  • 13. Adwait Ratnaparkhi. 1999. Learning to parse natural language with maximum entropy models. Machine Learning, 34:151-178.
  • 14. W. M. Soon, H. T. Ng, and C. Y. Lim. 2001. A machine learning approach to coreference resolution of noun phrases. Computational Linguistics, 27(4):521-544.
  • 17. L. Xu, A. Krzyzak, and C. Suen. 1992. Methods of combining multiple classifiers and their applications to handwriting recognition. IEEE Transactions on Systems, Man, and Cybernetics, 22(3):418-435.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.