Internet surveys are an increasingly popular method for collecting data in psychology, for obvious reasons, but they have some serious shortcomings. How do you know if a participant read the instructions properly? What if they clicked through randomly, completed it drunk, or maybe their cat walked across the keyboard? Now a possible solution has arrived in the form of a tool called the UserActionTracer (UAT), developed by Stefan Stieger and Ulf-Dietrich Reips.
The UAT is a piece of code that tells the participant’s web browser to record information, with timings, on every mouse click (single and double), choice in a drop-down menu, radio button, inserted piece of text, key press, and position of the mouse pointer. Stieger and Reips tested the tool with a survey of 1046 participants on the subject of instant messaging. It revealed that 31 participants changed their reported age; 5.9 per cent made suspicious changes to opinions they’d given; 46 per cent clicked through at least some parts of the questionnaire at a suspiciously fast rate (mainly for so-called ‘semantic differential’ items, in which the participant must choose a position between two contrasting adjectives); 3.6 per cent left the questionnaire inactive for long periods; 6.3 per cent displayed excessive clicking; and 11 per cent showed excessive mouse movements (it’s that cat again).
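The kinds of screening described above can be run on any log of per-item paradata. Here is a minimal sketch, assuming a simple log of per-item response times, click counts, and event timestamps; the function names and thresholds are illustrative placeholders, not the criteria Stieger and Reips actually used.

```python
# Illustrative paradata screening: flag suspiciously fast answers,
# excessive clicking, and long inactivity. All thresholds are
# assumptions for the sketch, not values from the paper.

def flag_fast_items(response_times_ms, min_ms=1000):
    """Return indices of items answered faster than a plausible reading time."""
    return [i for i, t in enumerate(response_times_ms) if t < min_ms]

def flag_excessive_clicks(click_counts, max_clicks=3):
    """Return indices of items that drew more clicks than answering requires."""
    return [i for i, c in enumerate(click_counts) if c > max_clicks]

def flag_long_inactivity(event_times_ms, max_gap_ms=5 * 60 * 1000):
    """Return True if any gap between logged actions exceeds max_gap_ms."""
    gaps = (b - a for a, b in zip(event_times_ms, event_times_ms[1:]))
    return any(g > max_gap_ms for g in gaps)

# Example: one participant's paradata for a five-item questionnaire.
times = [350, 4200, 2900, 610, 5100]     # ms spent on each item
clicks = [1, 2, 9, 1, 1]                 # clicks recorded per item
events = [0, 350, 4550, 400000, 401000]  # timestamps of logged actions

print(flag_fast_items(times))            # items answered implausibly fast
print(flag_excessive_clicks(clicks))     # items with excessive clicking
print(flag_long_inactivity(events))      # was there a long idle period?
```

In practice the thresholds would be calibrated per item type (a semantic differential item can be answered faster than a free-text one), which is presumably why fast click-throughs clustered on those items in the study.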
As a way of checking the usefulness of this extra behavioural data, the researchers concentrated on the fraction of participants for whom they had access to a secondary source of information that could be used to verify the questionnaire answers. This showed that participants who’d displayed more suspicious behaviour while filling out the questionnaire also tended to provide answers that didn’t match up with the other information source.
‘Our study shows that the UAT was successful in collecting highly detailed information about individual answering processes in online questionnaires,’ Stieger and Reips said. Another application of the tool is in pre-testing online questionnaires: researchers could use it to identify which items tend to prompt corrections or inappropriate click-throughs before rolling out a questionnaire to a larger sample.
Stieger, S., & Reips, U.-D. (2010). What are participants doing while filling in an online questionnaire: A paradata collection tool and an empirical study. Computers in Human Behavior, 26(6), 1488–1495. DOI: 10.1016/j.chb.2010.05.013