Testing for Specific Abilities Might Be a Better Predictor of Job Performance

New research raises questions about the relative value of tests used to pick the right person for the job.

A study by psychologists Phillip Ackerman, Ph.D., of the Georgia Institute of Technology, and Anna Cianciolo, Ph.D., of Yale University, suggests that testing for job or task content, not complexity, may be the key to predicting how different people will perform.

Tests that measure specific abilities - for example, well-defined verbal or spatial skills - required by a given job may predict performance better than tests that estimate some kind of general ability. These findings appear in the September issue of the Journal of Experimental Psychology: Applied, which is published by the American Psychological Association (APA).

Intrigued by conflicting data about effective predictors of task and job performance, Ackerman and Cianciolo tested 81 adults on cognitive skills such as pattern recognition, paper folding and other spatial abilities; numerical ability and problem solving; and memory and scanning tests. The psychologists also derived a general-ability score by combining spatial and numerical abilities. They then trained the participants to use complex computer software, the TRACON (Terminal Radar Approach Control) air-traffic control training simulator, similar to simulators used by the U.S. Federal Aviation Administration, the Department of Defense and other organizations.

Ackerman and Cianciolo manipulated the TRACON software to see how well participants performed under changing conditions. They increased and decreased task complexity, such as the number of arrivals or overflights, and also varied its content, such as shifting from tasks requiring them to solve spatial problems to tasks calling for them to process perceptual input faster.

The researchers analyzed participants' TRACON use to see how well the abilities tests predicted performance in light of changing conditions. In the first major finding, specific cognitive abilities correlated with - and thus could be used to predict - task performance.

The second major finding was that changing task complexity, unlike changing task content, did not affect the relationships between abilities and performance. Whether the task was easier or harder to perform, higher-ability participants held their advantage over lower-ability participants. As a result, ability tests were unable to forecast who might do better on more versus less complex tasks.

"A focus on content abilities (such as numerical, spatial or verbal), rather than on general intelligence, may be more valuable in predicting performance on particular jobs," says Ackerman. "We can also generalize that particular ability tests may be valuable predictors of performance in jobs with different levels of complexity, such as numerical ability tests predicting success in both accounting and bookkeeping."

The results, although preliminary, suggest that test batteries used to select people for particular jobs or tasks might predict performance better if they focus less on estimating overall, general ability, and more on the content of the job - such as verbal abilities for writing, and spatial abilities for air-traffic control.
