Crowdsourced evaluation is a promising method for evaluating engineering design attributes that require human input. The challenge is to estimate scores correctly using a massive and diverse crowd, particularly when only a small subset of evaluators has the expertise to give accurate evaluations. Since averaging evaluations across all evaluators results in an inaccurate crowd evaluation, this paper benchmarks a crowd consensus model that aims to identify experts so that their evaluations may be given more weight. Simulation results indicate that this crowd consensus model outperforms averaging when it correctly identifies experts in the crowd, under the assumption that only experts evaluate consistently. However, empirical results from a real human crowd indicate that this assumption may not hold even on a simple engineering design evaluation task: clusters of consistently wrong evaluators were found to exist alongside the cluster of experts. This suggests that neither simple averaging nor a crowd consensus model that relies only on the evaluations themselves may be adequate for engineering design tasks, calling for further research into methods of identifying experts within the crowd.
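As a rough illustration of the weighting idea described above (this is a minimal sketch, not the consensus model benchmarked in the paper), the snippet below simulates a crowd in which only a small group of experts evaluates consistently, then compares plain averaging against a simple consistency-weighted consensus; all evaluator counts, noise levels, and scores are assumed for the example.

```python
# Illustrative sketch only -- NOT the crowd consensus model benchmarked in the
# paper. It shows the general idea of weighting evaluators by how consistent
# their evaluations are with the running consensus. All counts, noise levels,
# and scores below are assumed for the example.
import numpy as np

rng = np.random.default_rng(0)

n_designs, n_evaluators, n_experts = 20, 50, 10
true_scores = rng.uniform(0.0, 10.0, n_designs)

# Experts evaluate consistently (small noise); the rest of the crowd is noisy.
expert_evals = true_scores + rng.normal(0.0, 0.5, (n_experts, n_designs))
novice_evals = true_scores + rng.normal(0.0, 3.0, (n_evaluators - n_experts, n_designs))
evals = np.vstack([expert_evals, novice_evals])

# Baseline: unweighted averaging over the whole crowd.
avg_estimate = evals.mean(axis=0)

# Consistency-weighted consensus: weight each evaluator by the inverse of the
# mean squared deviation of their evaluations from the current consensus.
weights = np.full(n_evaluators, 1.0 / n_evaluators)
for _ in range(20):
    consensus = weights @ evals                    # weighted score per design
    mse = ((evals - consensus) ** 2).mean(axis=1)  # each evaluator's deviation
    weights = 1.0 / (mse + 1e-6)
    weights /= weights.sum()

def rmse(estimate):
    return float(np.sqrt(((estimate - true_scores) ** 2).mean()))

print(f"RMSE, plain averaging      : {rmse(avg_estimate):.3f}")
print(f"RMSE, consistency-weighted : {rmse(consensus):.3f}")

# Caveat mirrored by the paper's empirical study: if the novices above shared a
# common offset (a consistently *wrong* cluster), this kind of reweighting
# could lock onto that cluster instead of the experts.
```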
When Crowdsourcing Fails: A Study of Expertise on Crowdsourced Design Evaluation
Alex Burnap
Yi Ren
Research Fellow
Department of Mechanical Engineering,
University of Michigan,
Ann Arbor, MI 48109
e-mail: [email protected]
Richard Gerth
Giannis Papazoglou
Department of Mechanical Engineering,
Cyprus University of Technology,
Limassol 3036, Cyprus
e-mail: [email protected]
Richard Gonzalez
Professor
Department of Psychology,
University of Michigan,
Ann Arbor, MI 48109
e-mail: [email protected]
Panos Y. Papalambros
Professor, Fellow ASME
Department of Mechanical Engineering,
University of Michigan,
Ann Arbor, MI 48109
e-mail: [email protected]
1Corresponding author.
Contributed by the Design Theory and Methodology Committee of ASME for publication in the JOURNAL OF MECHANICAL DESIGN. Manuscript received April 29, 2014; final manuscript received November 6, 2014; published online January 9, 2015. Assoc. Editor: Jonathan Cagan.
This material is declared a work of the U.S. Government and is not subject to copyright protection in the United States. Approved for public release; distribution is unlimited.
J. Mech. Des. Mar 2015, 137(3): 031101 (9 pages)
Published Online: March 1, 2015
Article history
Received: April 29, 2014
Revision Received: November 6, 2014
Online: January 9, 2015
Citation
Burnap, A., Ren, Y., Gerth, R., Papazoglou, G., Gonzalez, R., and Papalambros, P. Y. (March 1, 2015). "When Crowdsourcing Fails: A Study of Expertise on Crowdsourced Design Evaluation." ASME. J. Mech. Des. March 2015; 137(3): 031101. https://doi.org/10.1115/1.4029065