Crowd-based multi-predicate screening of papers in literature reviews

Evgeny Krivosheev, Fabio Casati, Boualem Benatallah

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

Systematic literature reviews (SLRs) are one of the most common and useful forms of scientific research and publication. Tens of thousands of SLRs are published each year, and this rate is growing across all fields of science. Performing an accurate, complete and unbiased SLR is, however, a difficult and expensive endeavor. This is true in general for all phases of a literature review, and in particular for the paper screening phase, where authors filter a set of potentially in-scope papers based on a number of exclusion criteria. To address this problem, in recent years the research community has begun to explore the use of the crowd to allow for faster, accurate, cheaper and unbiased screening of papers. Initial results show that crowdsourcing can be effective, even for relatively complex reviews. In this paper we derive and analyze a set of strategies for crowd-based screening, and show that an adaptive strategy, which continuously re-assesses the statistical properties of the problem to minimize the number of votes needed to make a decision for each paper, significantly outperforms a number of non-adaptive approaches in terms of cost and accuracy. We validate both the applicability and the results of the approach through a set of crowdsourcing experiments, and discuss properties of the problem and algorithms that we believe to be of general interest for classification problems where items are classified via a series of successive tests (as often happens in medicine).
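The screening setting the abstract describes — a paper is excluded as soon as any one exclusion criterion is judged to apply, with crowd votes collected per criterion until a decision is statistically safe — can be illustrated with a minimal sketch. This is not the authors' algorithm; the worker accuracy, prior, threshold and budget values below are hypothetical, and the aggregation is a simple Bayesian posterior over independent votes.

```python
# Illustrative sketch (NOT the paper's implementation): screen a paper
# against several exclusion criteria, collecting crowd votes per criterion
# until the posterior that the criterion applies (or not) is decisive.
# All numeric parameters are assumed for illustration.

WORKER_ACCURACY = 0.8   # assumed probability that a single vote is correct
PRIOR_APPLIES = 0.3     # assumed prior that a criterion excludes the paper
THRESHOLD = 0.95        # stop once posterior certainty reaches this level
MAX_VOTES = 10          # budget cap per (paper, criterion) pair

def posterior_applies(votes, acc=WORKER_ACCURACY, prior=PRIOR_APPLIES):
    """Posterior probability that the criterion applies, given yes/no votes."""
    p_yes, p_no = prior, 1.0 - prior
    for v in votes:  # v is True for an "exclude" vote
        p_yes *= acc if v else (1.0 - acc)
        p_no *= (1.0 - acc) if v else acc
    return p_yes / (p_yes + p_no)

def screen_paper(get_vote, n_criteria=3):
    """Classify a paper via successive tests: excluded if ANY criterion applies.

    get_vote(c) returns one crowd vote (True = "criterion c applies").
    Returns the decision and the total number of votes spent.
    """
    total_votes = 0
    for c in range(n_criteria):
        votes = []
        while True:
            votes.append(get_vote(c))
            total_votes += 1
            p = posterior_applies(votes)
            if p >= THRESHOLD:
                # One applying criterion is enough to exclude the paper.
                return "excluded", total_votes
            if 1.0 - p >= THRESHOLD or len(votes) >= MAX_VOTES:
                break  # criterion judged not to apply; move to the next test
    return "included", total_votes

# Example with a deterministic simulated crowd: unanimous "exclude" votes
# reach the threshold after only a few votes on the first criterion.
print(screen_paper(lambda c: True))   # ('excluded', 3)
print(screen_paper(lambda c: False))  # ('included', 6)
```

The point of the sketch is the cost asymmetry the abstract exploits: because a single applying criterion excludes a paper, votes can stop early, and an adaptive policy that keeps re-estimating quantities such as worker accuracy and criterion selectivity can spend votes where they decide the most.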

Original language: English
Title of host publication: The Web Conference 2018 - Proceedings of the World Wide Web Conference, WWW 2018
Publisher: Association for Computing Machinery, Inc
Pages: 55-64
Number of pages: 10
ISBN (Electronic): 9781450356398
DOI: https://doi.org/10.1145/3178876.3186036
Publication status: Published - 10 Apr 2018
Event: 27th International World Wide Web Conference, WWW 2018 - Lyon, France
Duration: 23 Apr 2018 - 27 Apr 2018

Publication series

Name: The Web Conference 2018 - Proceedings of the World Wide Web Conference, WWW 2018

Conference

Conference: 27th International World Wide Web Conference, WWW 2018
Country: France
City: Lyon
Period: 23.4.18 - 27.4.18

Keywords

  • Crowdsourcing
  • Human computation
  • Literature reviews

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Software


Cite this

Krivosheev, E., Casati, F., & Benatallah, B. (2018). Crowd-based multi-predicate screening of papers in literature reviews. In The Web Conference 2018 - Proceedings of the World Wide Web Conference, WWW 2018 (pp. 55-64). (The Web Conference 2018 - Proceedings of the World Wide Web Conference, WWW 2018). Association for Computing Machinery, Inc. https://doi.org/10.1145/3178876.3186036