Call for Papers

Important Dates
  • Submission of abstracts / intentions: 20 April 2018
  • Submission of Long Papers: 7 May 2018
  • Submission of Short Papers: 14 May 2018
  • Notification of Acceptance: 8 June 2018
  • Camera Ready Copy due: 22 June 2018
  • Conference: 10-14 September 2018

The CLEF Conference addresses all aspects of information access in any modality and language. The conference includes the presentation of research papers and a series of workshops presenting the results of lab-based comparative evaluation benchmarks. CLEF 2018 in Avignon is the 9th edition of the CLEF Conference series and the 19th year of the CLEF initiative as a forum for information retrieval (IR) evaluation. The conference has a clear focus on experimental IR as carried out within evaluation forums (CLEF Labs, TREC, NTCIR, FIRE, MediaEval, RomIP, SemEval, TAC, ...), with special attention to the challenges of multimodality, multilinguality, and interactive search, also considering specific classes of users such as children, students, and impaired users in different tasks (academic, professional, …). We invite paper submissions on significant new insights demonstrated on IR test collections, on the analysis of IR test collections and evaluation measures, as well as on concrete proposals to push the boundaries of the Cranfield/TREC/CLEF evaluation paradigm.

Committee

Conference Chairs:

Patrice Bellot (Aix-Marseille Univ., France)
Chiraz Trabelsi (Univ. of Tunis, Tunisia)

Program Chairs:

Josiane Mothe (Univ. de Toulouse, France)
Fionn Murtagh (Univ. of Huddersfield, UK)

Evaluation Lab Chairs:

Jian-Yun Nie (Univ. de Montréal, Canada)
Laure Soulier (LIP6, UPMC, France)

Proceedings Chairs:

Linda Cappellato (Univ. of Padua, Italy)
Nicola Ferro (Univ. of Padua, Italy)

Local Organizers:

Eric SanJuan (LIA, UAPV, France)

Publicity Chair:

Adrian Chifu (Aix-Marseille Université - CNRS LSIS, France)

Science Outreach Program Chair:

Aurelia Barriere (UAPV, France)

Topics

Relevant topics for the CLEF 2018 Conference include but are not limited to:
  • Information Access in any Language or Modality: information retrieval, image retrieval, question answering, search interfaces and design, infrastructures, etc.
  • Analytics for Information Retrieval: theoretical and practical results in the analytics field that are specifically targeted for information access data analysis, data enrichment, etc.
  • User Studies: user studies based either on laboratory experiments or on crowdsourcing.
  • In-depth analysis of past results and runs, both statistical and fine-grained.
  • Evaluation Initiatives: conclusions, lessons learned, impact, and projection of any evaluation initiative after completing its cycle.
  • Evaluation: methodologies, metrics, statistical and analytical tools, component-based evaluation, user groups and use cases, ground-truth creation, impact of multilingual/multicultural/multimodal differences, etc.
  • Technology Transfer: economic impact/sustainability of information access approaches, deployment and exploitation of systems, use cases, etc.
  • Interactive Information Retrieval Evaluation: the interactive evaluation of information retrieval systems using user-centered methods, evaluation of novel search interfaces, novel interactive evaluation methods, simulation of interaction, etc.
  • Specific Application Domains: Information access and its evaluation in application domains such as cultural heritage, digital libraries, social media, expert search, health information, legal documents, patents, news, books, plants, etc.
  • New Data Collections: presentation of new data collections with potentially high impact on future research, specific collections from companies or labs, multilingual collections.
  • Work on data from rare languages, as well as on collaborative and social data.

Format

Authors are invited to electronically submit original papers, which have not been published and are not under consideration elsewhere, using the LNCS proceedings format:
http://www.springer.com/it/computer-science/lncs/conference-proceedings-guidelines

Two types of papers are solicited:
  • Long papers: 12 pages max. Aimed at reporting complete research work.
  • Short papers: 6 pages max. Intended for position papers, new evaluation proposals, developments and applications, etc.
Papers will be peer-reviewed by at least three members of the program committee. Selection will be based on originality, clarity, and technical quality. Papers should be submitted in PDF format via EasyChair: https://www.easychair.org/conferences/?conf=clef2018