  • Book · © 2019

Information Retrieval Evaluation in a Changing World

Lessons Learned from 20 Years of CLEF

  • Provides an overview of 20 years of research in evaluating information retrieval systems
  • Presents summaries of the most important experimental results and findings concerning information retrieval in various types of media
  • Highlights the lessons learnt over the years, providing readers with useful guidelines on the best approaches and techniques

Part of the book series: The Information Retrieval Series (INRE, volume 41)

Table of contents (25 chapters)

  1. Front Matter, Pages i-xxii
  2. Experimental Evaluation and CLEF
     1. Front Matter, Pages 1-1
     2. The Evolution of Cranfield (Ellen M. Voorhees), Pages 45-69
     3. How to Run an Evaluation Task (Tetsuya Sakai), Pages 71-102
  3. Evaluation Infrastructures
     1. Front Matter, Pages 103-103
     2. An Innovative Approach to Data Management and Curation of Experimental Data Generated Through IR Test Collections (Maristella Agosti, Giorgio Maria Di Nunzio, Nicola Ferro, Gianmaria Silvello), Pages 105-122
     3. TIRA Integrated Research Architecture (Martin Potthast, Tim Gollub, Matti Wiegmann, Benno Stein), Pages 123-160
  4. Multilingual and Multimedia Information Retrieval
     1. Front Matter, Pages 175-175
     2. The Challenges of Language Variation in Information Access (Jussi Karlgren, Turid Hedlund, Kalervo Järvelin, Heikki Keskustalo, Kimmo Kettunen), Pages 201-216
     3. Multi-Lingual Retrieval of Pictures in ImageCLEF (Paul Clough, Theodora Tsikrika), Pages 217-230
     4. Experiences from the ImageCLEF Medical Retrieval and Annotation Tasks (Henning Müller, Jayashree Kalpathy-Cramer, Alba García Seco de Herrera), Pages 231-250
     5. Automatic Image Annotation at ImageCLEF (Josiah Wang, Andrew Gilbert, Bart Thomee, Mauricio Villegas), Pages 251-273
     6. Image Retrieval Evaluation in Specific Domains (Luca Piras, Barbara Caputo, Duc-Tien Dang-Nguyen, Michael Riegler, Pål Halvorsen), Pages 275-305
  5. Retrieval in New Domains
     1. Front Matter, Pages 331-331
     2. The Scholarly Impact and Strategic Intent of CLEF eHealth Labs from 2012 to 2017 (Hanna Suominen, Liadh Kelly, Lorraine Goeuriot), Pages 333-363
     3. Multilingual Patent Text Retrieval Evaluation: CLEF–IP (Florina Piroi, Allan Hanbury), Pages 365-387

About this book

This volume celebrates the twentieth anniversary of CLEF (the Cross-Language Evaluation Forum for its first ten years, and the Conference and Labs of the Evaluation Forum since) and traces its evolution over these first two decades. CLEF’s main mission is to promote research, innovation and development of information retrieval (IR) systems by anticipating trends in information management, in order to stimulate advances in IR system experimentation and evaluation.


The book is divided into six parts. Parts I and II provide background and context, with the first part explaining what is meant by experimental evaluation and the underlying theory, and describing how this has been interpreted in CLEF and in other internationally recognized evaluation initiatives. Part II presents research architectures and infrastructures that have been developed to manage experimental data and to provide evaluation services in CLEF and elsewhere. Parts III, IV and V represent the core of the book, presenting some of the most significant evaluation activities in CLEF, ranging from the early multilingual text processing exercises to the later, more sophisticated experiments on multimodal collections in diverse genres and media. In all cases, the focus is not only on describing “what has been achieved”, but above all on “what has been learnt”. The final part examines the impact CLEF has had on the research world and discusses current and future challenges, both academic and industrial, including the relevance of IR benchmarking in industrial settings.


Mainly intended for researchers in academia and industry, the book also offers useful insights and tips for practitioners working on the evaluation and performance of IR tools, and for graduate students specializing in information retrieval.

Editors and Affiliations

  • Dipartimento di Ingegneria dell’Informazione, Università degli Studi di Padova, Padova, Italy

    Nicola Ferro

  • Consiglio Nazionale delle Ricerche, Istituto di Scienza e Tecnologie dell’Informazione, Pisa, Italy

    Carol Peters

About the editors

Nicola Ferro is an Associate Professor of Computer Science at the University of Padua, Italy. His research interests include information retrieval, its experimental evaluation, multilingual information access and digital libraries. He is the coordinator of the CLEF evaluation initiative, which includes more than 200 research groups around the globe involved in large-scale IR evaluation activities. He was also the coordinator of the EU Seventh Framework Programme Network of Excellence PROMISE on information retrieval evaluation.


Carol Peters, now a Research Associate, was a researcher at the Italian National Research Council’s Istituto di Scienza e Tecnologie dell’Informazione. Her main research activities focused on the development of multilingual access mechanisms for digital libraries and evaluation methodologies for cross-language information retrieval systems. She led the EU Sixth Framework MultiMatch project and coordinated the Cross-Language Evaluation Forum (CLEF) during its first ten years of activity. In 2009, in recognition of her work for CLEF, she was awarded the Tony Kent Strix Award.

Bibliographic Information

  • Book Title: Information Retrieval Evaluation in a Changing World

  • Book Subtitle: Lessons Learned from 20 Years of CLEF

  • Editors: Nicola Ferro, Carol Peters

  • Series Title: The Information Retrieval Series

  • DOI: https://doi.org/10.1007/978-3-030-22948-1

  • Publisher: Springer Cham

  • eBook Packages: Computer Science, Computer Science (R0)

  • Copyright Information: Springer Nature Switzerland AG 2019

  • Hardcover ISBN: 978-3-030-22947-4 (published 26 August 2019)

  • Softcover ISBN: 978-3-030-22950-4 (published 26 August 2020)

  • eBook ISBN: 978-3-030-22948-1 (published 13 August 2019)

  • Series ISSN: 1871-7500

  • Series E-ISSN: 2730-6836

  • Edition Number: 1

  • Number of Pages: XXII, 595

  • Number of Illustrations: 14 b/w illustrations, 75 illustrations in colour

  • Topics: Information Storage and Retrieval, Natural Language Processing (NLP)
