Instructions Don’t Always Help Us To Do Better At A Task

By Emma Young

You might hate following instructions on how to do something, but there’s no avoiding them. Training on everything from driving a car to reading an X-ray starts with explicit instructions — whether verbal or written, as the authors of a new paper in the Journal of Experimental Psychology: Human Perception and Performance point out. In fact, Luke Rosedahl at UC Santa Barbara and colleagues write, “This practice is so widely accepted that scholarship primarily focuses on how to provide instructions, not whether these instructions help or not.” Now the team reports that for learning how to do well at certain tasks, instructions do not help at all.

The team explored something that we do all the time: mentally sorting items into categories. They looked at two types of category sorting tasks. With a “rule-based” type, your best strategy for doing well is to learn and follow an explicit logical rule or rules. Let’s say you’re presented with a set of shapes, which you need to sort into two categories, and the rule is: all the yellow triangles belong to Category A, and everything else to Category B. (Rule-based tasks can get more complicated, but the rules are usually straightforward to explain and absorb.)

Then there are “information-integration” tasks. To do well at these, we have to learn to integrate bits of information that we find less straightforward to process together consciously, and instead rely more on “implicit learning”. This type of learning is important for learning a language, for example, and underpins expert “gut feelings”. The best strategy for doing well at these tasks is often difficult to describe verbally — which made the team wonder whether instructions actually help.

To explore this, they recruited a total of 58 students at UCSB. Half were given a rule-based task and half an information-integration task. Within each of these two groups, half were first told the rule that they needed to follow to do well, and half were not. So half of each group had to learn the rule for themselves.

Each participant was shown a series of graphs displaying bars, which varied in thickness and angle of orientation. Each time, the participant had to identify a graph as “type A” or “type B”. Those in the rule-based group had to follow this rule to do well: “Respond A if the bars are thick and the orientation is low; otherwise respond B”. Those in the information-integration group had to follow this rule: “Respond A if the bar width is greater than the orientation; otherwise respond B.” Though this rule can be expressed verbally, it’s harder for a person (though not a computer) to grasp. As the researchers point out, “For humans, judging whether bar width is greater or less than bar orientation feels like comparing apples and oranges.”
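The contrast between the two rules can be made concrete in a few lines of code. This is only an illustrative sketch: the stimuli are reduced to two numbers (width and orientation on a shared arbitrary scale), and the cutoff values are invented for illustration — the paper’s actual stimulus parameters are not reproduced here.

```python
# Hypothetical sketch of the study's two category rules.
# Widths and orientations are assumed to be on the same arbitrary 0-100
# scale; the cutoffs below are invented, not taken from the paper.

def rule_based(width, orientation, width_cutoff=50, orientation_cutoff=50):
    """Rule-based task: 'A' if the bars are thick AND the orientation is low."""
    if width > width_cutoff and orientation < orientation_cutoff:
        return "A"
    return "B"

def information_integration(width, orientation):
    """Information-integration task: 'A' if bar width exceeds orientation.

    Trivial for a computer, but hard for a person, because width and
    orientation are different kinds of quantity ('apples and oranges').
    """
    return "A" if width > orientation else "B"

print(rule_based(80, 20))               # thick bars, low orientation -> A
print(information_integration(30, 60))  # width less than orientation -> B
```

Note that both rules are equally simple to a computer; the asymmetry the study found is purely in how easily humans can verbalise and apply them.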

Every time a participant assigned an image to a category, they got feedback on whether they were right or wrong. Whenever they made an error, those who’d first been told their rule were also given extra feedback on how they should have followed that rule, to pick the right category.

When the team looked at how the participants performed, the result was clear: receiving instructions dramatically improved performance for the rule-based task. But it had no impact whatsoever on the performance of the information-integration group.

“We are routinely taught that instructions are beneficial even in highly complex motor tasks,” the researchers comment. “Our results question this conventional wisdom.”

So what are the real-world implications of these results?

The team doesn’t recommend ditching instructions for information-integration tasks entirely. There are difficult classification tasks that do require implicit learning to build expertise. The team highlights radiologists screening mammograms for follow-up, for example. An expert radiologist may have implicitly learned difficult-to-explain patterns of calcifications that would lead them to recommend further investigation. However, initial use of a rule that anyone with more than five calcifications per cubic centimetre should have further evaluation could “bootstrap” that learning, the team writes.

Still, they think their work does have implications for this type of training: “Our results suggest that the development of expertise might be facilitated if instruction focused exclusively on the explicit components of expert classification, and the implicit components were improved exclusively through practice,” they write.

Perhaps further study involving more complex, real-world tasks is required before any drastic changes in training are made. But the finding that instructions don’t always help us to do better at a task is an important one.

When instructions don’t help: Knowing the optimal strategy facilitates rule-based but not information-integration category learning.

Emma Young (@EmmaELYoung) is a staff writer at BPS Research Digest