TY - JOUR
T1 - Who Is a Better Decision Maker? Data-Driven Expert Ranking Under Unobserved Quality
AU - Geva, Tomer
AU - Saar-Tsechansky, Maytal
N1 - Publisher Copyright:
© 2020 Production and Operations Management Society
PY - 2021/1
Y1 - 2021/1
N2 - The capacity to rank expert workers by their decision quality is a key managerial task of substantial significance to business operations. However, when no ground truth information is available on experts’ decisions, the evaluation of expert workers typically requires enlisting peer-experts, and this form of evaluation is prohibitively costly in many important settings. In this work, we develop a data-driven approach for producing effective rankings based on the decision quality of expert workers; our approach leverages historical data on past decisions, which are commonly available in organizational information systems. Specifically, we first formulate a new business data science problem: Ranking Expert decision makers’ unobserved decision Quality (REQ) using only historical decision data and excluding evaluation by peer experts. The REQ problem is challenging because the correct decisions in our settings are unknown (unobserved) and because some of the information used by decision makers might not be available for retrospective evaluation. To address the REQ problem, we develop a machine-learning–based approach and analytically and empirically explore conditions under which our approach is advantageous. Our empirical results over diverse settings and datasets show that our method yields robust performance: Its rankings of expert workers are consistently either superior or at least comparable to those obtained by the best alternative approach. Accordingly, our method constitutes a de facto benchmark for future research on the REQ problem.
KW - data science
KW - decision quality evaluation
KW - label accuracy
KW - machine learning
KW - worker ranking
UR - http://www.scopus.com/inward/record.url?scp=85096681210&partnerID=8YFLogxK
U2 - 10.1111/poms.13260
DO - 10.1111/poms.13260
M3 - Article
AN - SCOPUS:85096681210
SN - 1059-1478
VL - 30
SP - 127
EP - 144
JO - Production and Operations Management
JF - Production and Operations Management
IS - 1
ER -