Show simple item record

dc.contributor.author	Daymude, Lois Kathleen Hollister
dc.date.accessioned	2014-03-04T18:58:50Z
dc.date.available	2014-03-04T18:58:50Z
dc.date.issued	2010-12
dc.identifier.other	daymude_lois_k_201012_phd
dc.identifier.uri	http://purl.galileo.usg.edu/uga_etd/daymude_lois_k_201012_phd
dc.identifier.uri	http://hdl.handle.net/10724/26879
dc.description.abstract	Students’ low test scores in mathematics can be disappointing for students, teachers, and parents. Data from analyses of errors on mathematics tests have the potential to inform students, teachers, and parents about improving the processes of teaching and learning mathematics. Maximizing performance on tests enhances students’ academic success and opportunities. This study addressed high school students’ and their teacher’s analysis of errors on mathematics tests using a theoretical perspective of pragmatism. I coached 43 ninth-grade students in the fall of 2009 in the use of a tool to aid student metacognition, student test performance, and student learning through more informed teaching. This mixed methods study used qualitative and quantitative methods to answer these questions:
1. What effects does the use of the test-error analysis tool have on students’ mathematics test-related behavior (i.e., preparing for and taking tests) and outcomes (e.g., errors made, points lost, test scores)?
2. What benefits and drawbacks do my students, their parents, and I perceive from the use of the test-error analysis tool with mathematics tests? In particular, what do the students and I learn from the analysis?
3. What are the most common types of errors, according to the analysis, in a mathematics course? How does this information inform the students and me to promote the learning of mathematics?
4. What groupings, patterns, and trends can be observed from test-error analysis data? For example, do the frequencies of some error types decrease? If latent groups are identified, what are their characteristics and what are the probabilities that students move from one group to another?
The study used Excel, MPlus, Fathom, and Minitab for quantitative analysis and coding for qualitative analysis, integrating the results for conclusions.
The most common error types were the following: not knowing how, knowing how but forgetting, making arithmetic errors, and running out of time. Testing-process errors tended to decrease, whereas mathematical-content errors worsened slightly as the content became more difficult over the semester. Students’ cumulative test scores were better than their unit test scores, indicating a possible benefit of the test-error analysis process. Students whose grades were in the middle of the class tended to benefit more from the analyses than struggling or excelling students. The analyses also provided very helpful information for the parents of struggling students and for the teacher’s future instruction and assessment.
dc.language	eng
dc.publisher	uga
dc.rights	public
dc.subject	Test error analysis
dc.subject	Student analysis
dc.subject	Assessment
dc.subject	Algebra
dc.subject	Error types
dc.subject	Improving instruction
dc.subject	Parent communication
dc.subject	Metacognition
dc.subject	Mixed methods
dc.subject	Mathematics education
dc.subject	Secondary
dc.subject	High school
dc.subject	Latent class analysis
dc.subject	MPlus
dc.subject	Fathom
dc.subject	Minitab
dc.title	Test error analysis in mathematics education
dc.title.alternative	a mixed methods study
dc.type	Dissertation
dc.description.degree	PhD
dc.description.department	Mathematics and Science Education
dc.description.major	Mathematics Education
dc.description.advisor	John Olive
dc.description.committee	John Olive
dc.description.committee	Jeremy Kilpatrick
dc.description.committee	Allan Cohen


Files in this item


There are no files associated with this item.

