
Dublin City University at CLEF 2004: Experiments with the ImageCLEF St. Andrew’s Collection

  • Conference paper
Multilingual Information Access for Text, Speech and Images (CLEF 2004)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3491)


Abstract

For the CLEF 2004 ImageCLEF St. Andrew's Collection task, the Dublin City University group carried out three sets of experiments: standard cross-language information retrieval (CLIR) runs using topic translation via machine translation (MT); combination of these runs with image-matching results from the GIFT/Viper system; and a novel document-rescoring approach based on automatic MT evaluation metrics. Our standard MT-based CLIR works well on this task. Encouragingly, combining it with the image-matching lists is also observed to produce small positive changes in the retrieval output. However, rescoring using the MT evaluation metrics in their current form significantly reduced retrieval effectiveness.




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg


Cite this paper

Jones, G.J.F., Groves, D., Khasin, A., Lam-Adesina, A., Mellebeek, B., Way, A. (2005). Dublin City University at CLEF 2004: Experiments with the ImageCLEF St. Andrew’s Collection. In: Peters, C., Clough, P., Gonzalo, J., Jones, G.J.F., Kluck, M., Magnini, B. (eds) Multilingual Information Access for Text, Speech and Images. CLEF 2004. Lecture Notes in Computer Science, vol 3491. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11519645_64


  • DOI: https://doi.org/10.1007/11519645_64

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-27420-9

  • Online ISBN: 978-3-540-32051-7

