Developing and validating a multiple-choice version of the Cognitive Reflection Test
We speculate that the specific nature of the CRT items helps build construct equivalence among the different response formats.
We recommend using the validated multiple-choice version of the CRT presented here, particularly the four-option CRT, for practical and methodological reasons.
For stem-equivalent items (i.e., items that differ only in whether answer options are listed), open-ended questions are more difficult to solve than multiple-choice ones, because presenting options enables a different array of cognitive strategies that increases performance (Bonner, ).
Second, we tested whether the CRT response format altered the well-established associations between CRT performance and benchmark variables: belief bias, denominator neglect, paranormal beliefs, actively open-minded thinking, and numeracy.
Third, we tested the psychometric quality of the different test formats by comparing their internal consistency.
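Internal consistency of a scale like the CRT is commonly summarized with Cronbach's alpha. As a minimal sketch (the paper does not specify its estimator; the data below are hypothetical), alpha can be computed from an items-by-respondents score matrix as k/(k-1) * (1 - sum of item variances / variance of total scores):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical correct(1)/incorrect(0) responses of 4 respondents to 3 items:
scores = np.array([[1, 1, 1],
                   [0, 0, 0],
                   [1, 1, 0],
                   [0, 1, 1]])
print(round(cronbach_alpha(scores), 3))  # ≈ 0.632
```

Comparing the alphas obtained for the open-ended and multiple-choice formats on matched samples is one straightforward way to operationalize the comparison described above.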
The Cognitive Reflection Test, measuring intuition inhibition and cognitive reflection, has become extremely popular because it reliably predicts reasoning performance, decision-making, and beliefs.
Across studies, the response format of CRT items sometimes differs, based on the assumed construct equivalence of tests with open-ended versus multiple-choice items.