Identifying experts in the crowd for evaluation of engineering designs

    1. [1] University of Michigan (USA)
  • Published in: Journal of Engineering Design, ISSN 0954-4828, Vol. 28, No. 5, 2017, pp. 317-337
  • Language: English
  • Full text not available
  • Abstract
    • Crowdsourcing offers the opportunity to gather evaluations on concept designs from evaluators that otherwise may not have been considered, thus leveraging additional expertise to improve decision making during early stages of the design process. Previous research has shown that crowdsourcing may fail to evaluate correctly even ‘simple’ engineering design concepts, because non-expert evaluations overwhelm the entire crowd evaluation. This article proposes using expertise prediction heuristics to automatically identify experts and filter non-experts prior to a crowdsourced evaluation. We conducted an experiment to test four common expertise prediction heuristics: (1) evaluator demographics, (2) evaluation reaction time, (3) mechanical reasoning aptitude, and (4) ‘easy and known’ versions of the actual ‘difficult and unknown’ design evaluation task. The results show statistical significance between variables for all four heuristics; however, most predictive power is garnered going from easy to difficult tasks. A combination of these heuristics offers a practical way to identify and filter experts from non-experts, thus improving crowdsourced evaluation of early-stage design concepts.
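
The filtering approach the abstract describes, scoring each evaluator with the four heuristics and keeping only those predicted to be experts before aggregating their ratings, can be sketched as follows. This is a minimal illustration under stated assumptions: the field names, thresholds, and the simple vote-count combination are hypothetical, not the paper's actual heuristics or cutoffs.

```python
# A minimal sketch of heuristic-based expert filtering for crowd evaluation.
# All fields, thresholds, and the scoring scheme are illustrative assumptions.

from dataclasses import dataclass
from statistics import mean

@dataclass
class Evaluator:
    is_engineer: bool        # (1) demographics heuristic (hypothetical proxy)
    reaction_time_s: float   # (2) evaluation reaction time
    mech_reasoning: float    # (3) mechanical reasoning aptitude, scaled 0-1
    easy_task_correct: bool  # (4) solved the 'easy and known' task version
    rating: float            # this evaluator's rating of the design concept

def expertise_score(e: Evaluator) -> int:
    """Count how many of the four heuristics flag the evaluator as
    expert-like. The individual cutoffs here are assumed for illustration."""
    score = 0
    score += e.is_engineer                     # demographics
    score += 5.0 <= e.reaction_time_s <= 60.0  # neither hasty nor stalled
    score += e.mech_reasoning >= 0.7           # aptitude above a threshold
    score += e.easy_task_correct               # passed the easy analogue
    return score

def crowd_estimate(crowd: list[Evaluator], min_score: int = 3) -> float:
    """Aggregate ratings only from evaluators who pass enough heuristics."""
    experts = [e for e in crowd if expertise_score(e) >= min_score]
    pool = experts or crowd  # fall back to the full crowd if no one passes
    return mean(e.rating for e in pool)
```

Treating each heuristic as an equal vote is only one simple way to combine them; since the abstract notes that the easy-to-difficult task heuristic carries most of the predictive power, a weighted or learned combination would be a natural refinement.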


Fundación Dialnet

Dialnet Plus

  • Más información sobre Dialnet Plus

Opciones de compartir

Opciones de entorno