Predicting Job Performance

Paul Barrett, October 2022



The now-famous article by Schmidt, F.L., & Hunter, J.E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124, 2, 262-274, made available a list of meta-analytic validity coefficients for predictors of job performance which have been, and still are, regarded by many as the de facto values for the workplace.

However, in 2013, Frank Schmidt, In-Sue Oh, and Jonathan Shaffer presented some updated validities in a talk given at a Personnel Testing Council of Metropolitan Washington (PTCMW) meeting: Schmidt, F.L., Oh, I-S., & Shaffer, J. (2013). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 95 years of research findings: An update, pp. 1-9 (open-access).

Then, in 2016, the 2013 conference paper was augmented substantially and published as a major preprint article on the Social Science Research Network (SSRN) site. It was eventually taken down from the SSRN website as the authors sought formal publication in a journal. The link here is to a PDF copy of the article on the University of Baltimore website (open-access): Schmidt, F.L., Oh, I-S., & Shaffer, J.A. (2016). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 100 years of research findings, pp. 1-74.

As far as I'm aware (having exchanged emails with In-Sue Oh), the two remaining authors are still trying to get the paper published (sadly, Frank Schmidt passed away in August 2021), after the initial 2016 submission attracted limited interest. Since 2020, the paper has been undergoing a Revise and Resubmit process with a major psychology journal.

I have included its indices here because some of its revised estimates are astounding (e.g. unstructured employment interviews have the same validity as structured ones; when I queried this with In-Sue Oh, it appeared that the studies used in the meta-analysis of unstructured interviews were actually structured to some degree, although not to the more organized level of the structured-interview studies).

Finally, a more recent article is in press in an APA journal: Sackett, P.R., Zhang, C., Berry, C.M., & Lievens, F. (2022). Revisiting meta-analytic estimates of validity in personnel selection: Addressing systematic overcorrection for restriction of range. Journal of Applied Psychology, advance online publication, pp. 1-104. This paper recalculates the Schmidt and Hunter (1998) validities based upon a revision of the estimates used to correct for restriction of range in the original paper. As the authors put it: "We conclude that our selection procedures remain useful, but selection predictor-criterion relationships are considerably lower than previously thought."
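For readers unfamiliar with what "correcting for restriction of range" involves, the classic direct (Thorndike Case II) correction, shown below purely for illustration (the papers themselves work with more elaborate variants), rescales an observed correlation r by the ratio u of the unrestricted to the restricted predictor standard deviation:

$$ r_c \;=\; \frac{u\,r}{\sqrt{1 + (u^2 - 1)\,r^2}}, \qquad u = \frac{S_{\text{unrestricted}}}{s_{\text{restricted}}} $$

The larger the assumed value of u, the larger the upward correction; revising the size of those assumed corrections is precisely what drives the lower Sackett et al. (2022) estimates.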

The figure below presents the validity estimates from the three papers introduced above. It is a dynamic graph: if you hover over any legend entry at the top of the graph, it will highlight the bars for that publication.

If you click on any legend entry, it will toggle the bars for that publication off or back on, allowing you to more easily make pairwise comparisons of validity estimates.

Page 2 of the figure provides the table of all the validity coefficients from each article, as plotted in the figure.
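As an aside, for anyone curious how that style of legend interactivity is typically produced, the sketch below is a minimal example using Plotly in Python. This is an assumption on my part (the figure on this page may well be built with a different charting tool), and the validity values in the code are placeholders rather than the plotted data; but it shows the essential design: one bar trace per publication, so that clicking a legend entry toggles that publication's bars on or off.

```python
# Minimal sketch of a grouped bar chart with per-publication legend toggling.
# Assumes Plotly; all validity values below are placeholders, NOT the real data.
import plotly.graph_objects as go

predictors = ["GMA", "Structured interview", "Unstructured interview", "Conscientiousness"]

fig = go.Figure()
# One trace per publication: clicking its legend entry shows/hides its bars
# (default Plotly legend behaviour), which allows pairwise comparisons.
fig.add_trace(go.Bar(name="Schmidt & Hunter (1998)", x=predictors, y=[0.5, 0.5, 0.4, 0.3]))
fig.add_trace(go.Bar(name="Schmidt, Oh & Shaffer (2016)", x=predictors, y=[0.6, 0.6, 0.6, 0.2]))
fig.add_trace(go.Bar(name="Sackett et al. (2022)", x=predictors, y=[0.3, 0.4, 0.2, 0.2]))

fig.update_layout(barmode="group", yaxis_title="Estimated operational validity")
fig.show()
```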

From these data, it is no longer clear whether Big Five personality attributes are even worth assessing via self-report, contextualised or otherwise, such is their explanatory inaccuracy. However, as shown in Oh, I-S., Wang, G., & Mount, M.K. (2011). Validity of observer ratings of the five-factor model of personality traits: A meta-analysis. Journal of Applied Psychology, 96, 4, 762-773, Big Five validities computed using single-rater observer ratings do exceed self-report questionnaire validities.

As to GMA, its 1998 validity estimate has almost halved in the more recent article from Sackett et al. (2022). I have said more on this issue in a recent Cognadev blog post. An earlier article, Richardson, K., & Norgate, S.H. (2015). Does IQ really predict job performance? Applied Developmental Science, 19, 3, 153-169 (open-access), had already questioned the accuracy of the earlier meta-analytic validity estimates in this area.

Anyway, I'm sure controversy will remain as to the wisdom or otherwise of correcting correlations for attenuation due to a variety of factors, but the graph above does at least present the competing evidence to date.
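To make the correction issue concrete, here is a small worked example. Every number in it is a hypothetical assumption of mine, not a value taken from any of the papers above; it simply shows how sensitive a "corrected" validity is to the artifact values one assumes.

```python
# Illustrative only: how stacked psychometric corrections inflate an observed
# validity coefficient. All input values are hypothetical assumptions, not
# figures taken from any of the papers discussed above.
import math

r_observed = 0.25   # hypothetical observed predictor-criterion correlation
r_yy = 0.52         # assumed reliability of the job-performance criterion
u = 1.5             # assumed ratio of unrestricted to restricted predictor SD

# Correction for attenuation due to criterion unreliability
r_disattenuated = r_observed / math.sqrt(r_yy)

# Direct (Thorndike Case II) correction for range restriction
r_corrected = (u * r_disattenuated) / math.sqrt(1 + (u**2 - 1) * r_disattenuated**2)

print(f"observed r                          = {r_observed:.2f}")
print(f"after criterion-unreliability fix   = {r_disattenuated:.2f}")
print(f"after range-restriction correction  = {r_corrected:.2f}")
```

With these assumed values, an observed correlation of .25 roughly doubles to about .48; choose different reliability and range-restriction assumptions and the same observed data yield a very different "operational validity", which is precisely the source of the disagreement between the estimates plotted above.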