DREC: Towards a datasheet for reporting experiments in crowdsourcing

Jorge Ramírez, Marcos Baez, Fabio Casati, Luca Cernuzzi, Boualem Benatallah

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Factors such as instructions, payment schemes, platform demographics, along with strategies for mapping studies into crowdsourcing environments, play an important role in the reproducibility of results. However, inferring these details from scientific articles is often a challenging endeavor, calling for the development of proper reporting guidelines. This paper makes the first steps towards this goal, by describing an initial taxonomy of relevant attributes for crowdsourcing experiments, and providing a glimpse into the state of reporting by analyzing a sample of CSCW papers.

Original language: English
Title of host publication: CSCW 2020 Companion - Conference Companion Publication of the 2020 Computer Supported Cooperative Work and Social Computing
Publisher: Association for Computing Machinery
Pages: 377-382
Number of pages: 6
ISBN (electronic): 9781450380591
DOI
Publication status: Published - 17 Oct 2020
Event: 3rd ACM Conference on Computer-Supported Cooperative Work and Social Computing, CSCW 2020 - Virtual, Online, United States
Duration: 17 Oct 2020 - 21 Oct 2020

Publication series

Name: Proceedings of the ACM Conference on Computer Supported Cooperative Work, CSCW

Conference

Conference: 3rd ACM Conference on Computer-Supported Cooperative Work and Social Computing, CSCW 2020
Country: United States
City: Virtual, Online
Period: 17.10.20 - 21.10.20

ASJC Scopus subject areas

  • Software
  • Computer Networks and Communications
  • Human-Computer Interaction
