Examining teachers’ knowledge on a large scale involves addressing substantial measurement and logistical issues; thus, existing teacher knowledge assessments have mainly consisted of selected-response items because of their ease of scoring. Although open-ended responses could capture a more complex picture of teachers’ thinking and provide further insights into their understanding, scoring these responses is expensive and time consuming, which limits their use in large-scale studies. In this study, we investigated whether a novel statistical approach, topic modeling, could be used to score teachers’ open-ended responses and, if so, whether these scores would capture nuances of teachers’ understanding. To test this hypothesis, we used topic modeling to analyze teachers’ responses to a proportional reasoning task and examined the associations of the topics identified through this method with categories identified by a separate qualitative analysis of the same data, as well as with teachers’ performance on a measure of ratios and proportional relationships. Our findings suggest that topic modeling captured nuances of teachers’ responses and that such nuances differentiated teachers’ performance on the same concept. We discuss the implications of this study for education research.
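The general workflow described above can be illustrated with a minimal sketch using latent Dirichlet allocation, a common topic model, via scikit-learn. The example responses below are hypothetical stand-ins for open-ended answers to a proportional reasoning task, not data from the study, and the two-topic setting is an illustrative assumption rather than the study's actual model specification.

```python
# Sketch: scoring short open-ended responses with a topic model (LDA).
# The responses and topic count are hypothetical, for illustration only.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

responses = [
    "the ratio stays the same so multiply both quantities",
    "set up a proportion and cross multiply to solve",
    "add the same number to both parts of the ratio",
    "the quantities grow additively so add the difference",
    "multiply each quantity by the scale factor to keep the ratio",
    "cross multiply the proportion to find the missing value",
]

# Build a document-term matrix from the responses.
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(responses)

# Fit LDA; theta holds each response's distribution over topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(dtm)

# The dominant topic per response can serve as a machine-assigned
# category, which researchers would then interpret and validate
# against a qualitative coding of the same data.
for text, probs in zip(responses, theta):
    print(probs.argmax(), round(float(probs.max()), 2), text)
```

In practice, the topic proportions (rows of `theta`) would be compared with human-assigned categories and with external performance measures to check whether the machine-identified topics track meaningful differences in understanding.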
Copur-Gencturk, Y., Choi, H.-J., Kim, S., & Cohen, A. S. (2022). Investigating teachers’ understanding through topic modeling: A promising approach to studying teachers’ knowledge. Journal of Mathematics Teacher Education, 26, 281–302. https://doi.org/10.1007/s10857-021-09529-w