
Intelligent Feedback on Hypothesis Testing

Sietske Tacoma, Bastiaan Heeren, Johan Jeuring, and Paul Drijvers

Hypothesis testing involves a complex stepwise procedure that is challenging for many students in introductory university statistics courses. In this paper we assess how feedback from an Intelligent Tutoring System can address the logic of hypothesis testing and whether such feedback contributes to first-year social sciences students' proficiency in carrying out hypothesis tests. The feedback design combined elements of the model-tracing and constraint-based modeling paradigms to address both the individual steps and the relations between steps. To evaluate the feedback, students in an experimental group (N = 163) received the designed intelligent feedback in six hypothesis-testing construction tasks, while students in a control group (N = 151) received only stepwise verification feedback in these tasks. Results showed that students receiving intelligent feedback spent more time on the tasks, solved more tasks, and made fewer errors than students receiving only verification feedback. These positive results did not transfer to follow-up tasks, which might be a consequence of the isolated nature of those tasks. We conclude that the designed feedback may support students in learning to solve hypothesis-testing construction tasks independently and that it facilitates the creation of more such tasks.

In International Journal of Artificial Intelligence in Education, 30(4):616–636, 2020.

Full paper
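
The combination the abstract describes, model tracing (checking each submitted step against expected next steps) and constraint-based modeling (checking relations between steps, such as whether the conclusion agrees with the computed p-value), can be illustrated with a minimal sketch. The Haskell below is purely illustrative: the Step and Feedback types, the 0.05 significance level, and all function names are invented for this example and do not reflect the authors' implementation.

```haskell
import Data.Maybe (listToMaybe)

-- Hypothetical representation of the steps in a hypothesis test.
data Step
  = ChooseTest String              -- e.g. "one-sample t-test"
  | StateHypotheses String String  -- H0 and H1
  | ComputeStatistic Double
  | ComputePValue Double
  | DrawConclusion Bool            -- True = reject H0
  deriving (Show, Eq)

data Feedback = Correct | StepError String | RelationError String
  deriving Show

-- Model tracing: is the submitted step one of the expected next steps?
traceStep :: [Step] -> Step -> Maybe String
traceStep expected step
  | step `elem` expected = Nothing
  | otherwise = Just "This step does not match an expected next step."

-- Constraint-based check: do the steps entered so far respect the
-- relations between steps, e.g. the conclusion must agree with the p-value?
checkConstraints :: [Step] -> Maybe String
checkConstraints history =
  case (pValueOf history, conclusionOf history) of
    (Just p, Just reject)
      | reject && p >= alpha    -> Just "H0 was rejected, but p >= alpha."
      | not reject && p < alpha -> Just "H0 was not rejected, but p < alpha."
    _ -> Nothing
  where
    alpha = 0.05  -- assumed significance level, for illustration only
    pValueOf hs     = listToMaybe [p | ComputePValue p <- hs]
    conclusionOf hs = listToMaybe [c | DrawConclusion c <- hs]

-- Combined feedback: first trace the individual step, then check the
-- constraints over the whole (extended) solution.
feedback :: [Step] -> [Step] -> Step -> Feedback
feedback expected history step =
  case traceStep expected step of
    Just msg -> StepError msg
    Nothing  -> maybe Correct RelationError
                      (checkConstraints (history ++ [step]))

main :: IO ()
main = print (feedback [DrawConclusion True, DrawConclusion False]
                       [ComputePValue 0.30]
                       (DrawConclusion True))
-- prints: RelationError "H0 was rejected, but p >= alpha."
```

Separating the step check from the cross-step constraint check mirrors the two paradigms named in the abstract: model tracing judges the step itself, while the constraints catch errors in how steps relate to one another.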