Automated feedback on the structure of hypothesis tests
Sietske Tacoma, Bastiaan Heeren, Johan Jeuring, and Paul Drijvers
Understanding the structure of the hypothesis testing procedure is challenging for many first-year university students. In this paper we investigate how providing automated feedback in an Intelligent Tutoring System can help students in an introductory university statistics course. Students in an experimental group (N=154) received elaborate feedback on errors in the structure of hypothesis tests in six homework tasks, while students in a control group (N=145) received verification feedback only. Effects of feedback type on student behavior were measured by comparing the number of tasks tried, the number of tasks solved, and the number of errors in the structure of hypothesis tests between the two groups. Results show that the elaborate feedback stimulated students to solve more tasks and to make fewer errors than verification feedback alone did. This suggests that elaborate feedback contributes to students’ understanding of the structure of the hypothesis testing procedure.
In Proceedings of the Eleventh Congress of the European Society for Research in Mathematics Education, pages 2969-2976, 2019.
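To give a rough idea of what the "structure of a hypothesis test" referred to in the abstract looks like in practice, the sketch below walks through the usual steps (hypotheses, significance level, test statistic, p-value, conclusion) for a one-sample t-test in Python with SciPy. The data, hypothesised mean, and significance level are illustrative assumptions only and do not come from the paper or its tasks.

```python
# Illustrative sketch of the structure of a hypothesis test:
# formulate hypotheses, fix a significance level, compute a test
# statistic and p-value, then draw a conclusion in context.
# All numbers below are made-up example values.
from scipy import stats

# Step 1: hypotheses about the population mean mu.
#   H0: mu = 7.0    (null hypothesis)
#   H1: mu != 7.0   (two-sided alternative)
mu_0 = 7.0

# Step 2: significance level.
alpha = 0.05

# Step 3: sample data and test statistic (one-sample t-test).
sample = [6.2, 7.8, 7.1, 6.9, 8.0, 7.4, 6.5, 7.2]
t_stat, p_value = stats.ttest_1samp(sample, popmean=mu_0)

# Step 4: decision rule and conclusion.
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value < alpha:
    print("Reject H0: the population mean appears to differ from 7.0.")
else:
    print("Do not reject H0: no evidence the mean differs from 7.0.")
```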