|dc.description.abstract||Students’ low test scores in mathematics can be disappointing for students, teachers, and parents. Data from analyses of errors on mathematics tests have the potential to inform students, teachers, and parents about improving the processes of teaching and learning mathematics. Maximizing performance on tests enhances students’ academic success and opportunities. This study addressed high school students’ and their teacher’s analysis of errors on mathematics tests from a theoretical perspective of pragmatism. In the fall of 2009, I coached 43 ninth-grade students in the use of a tool designed to support student metacognition, improve test performance, and promote learning through more informed teaching. This mixed methods study combined qualitative and quantitative analyses to answer the following questions:
1. What effects does the use of the test-error analysis tool have on students’ mathematics test-related behavior (i.e., preparing for and taking tests) and outcomes (e.g., errors made, points lost, test scores)?
2. What benefits and drawbacks do my students, their parents, and I perceive from the use of the test-error analysis tool with mathematics tests? In particular, what do the students and I learn from the analysis?
3. What are the most common types of errors, according to the analysis, in a mathematics course? How does this information help the students and me promote the learning of mathematics?
4. What groupings, patterns, and trends can be observed in the test-error analysis data? For example, do the frequencies of some error types decrease? If latent groups are identified, what are their characteristics, and what are the probabilities that students move from one group to another?
The study used Excel, MPlus, Fathom, and Minitab for the quantitative analysis and coding for the qualitative analysis, integrating the results to draw conclusions. The most common error types were not knowing how, knowing how but forgetting, making arithmetic errors, and running out of time. Testing-process errors tended to improve, while mathematical-content errors worsened slightly as the content became more difficult over the semester. Students’ cumulative test scores were better than their unit test scores, indicating a possible benefit of the test-error analysis process. Students whose grades were in the middle of the class tended to benefit more from the analyses than struggling or excelling students. The analyses also yielded very helpful information for the parents of struggling students and for the teacher’s future instruction and assessment.||