DREC: Towards a datasheet for reporting experiments in crowdsourcing

Jorge Ramírez, Marcos Baez, Fabio Casati, Luca Cernuzzi, Boualem Benatallah

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Factors such as instructions, payment schemes, and platform demographics, along with strategies for mapping studies into crowdsourcing environments, play an important role in the reproducibility of results. However, inferring these details from scientific articles is often challenging, calling for the development of proper reporting guidelines. This paper takes the first steps towards this goal by describing an initial taxonomy of relevant attributes for crowdsourcing experiments and by providing a glimpse into the state of reporting through an analysis of a sample of CSCW papers.

Original language: English
Title of host publication: CSCW 2020 Companion - Conference Companion Publication of the 2020 Computer Supported Cooperative Work and Social Computing
Publisher: Association for Computing Machinery
Pages: 377-382
Number of pages: 6
ISBN (Electronic): 9781450380591
DOIs:
Publication status: Published - 17 Oct 2020
Event: 3rd ACM Conference on Computer-Supported Cooperative Work and Social Computing, CSCW 2020 - Virtual, Online, United States
Duration: 17 Oct 2020 - 21 Oct 2020

Publication series

Name: Proceedings of the ACM Conference on Computer Supported Cooperative Work, CSCW

Conference

Conference: 3rd ACM Conference on Computer-Supported Cooperative Work and Social Computing, CSCW 2020
Country: United States
City: Virtual, Online
Period: 17.10.20 - 21.10.20

Keywords

  • Crowdsourcing
  • Crowdsourcing experiments
  • Guidelines for reporting experiments
  • Taxonomy

ASJC Scopus subject areas

  • Software
  • Computer Networks and Communications
  • Human-Computer Interaction
