dc.contributor.author | Jennings, Jeremy Kyle | |
dc.date.accessioned | 2014-09-25T04:30:17Z | |
dc.date.available | 2014-09-25T04:30:17Z | |
dc.date.issued | 2014-05 | |
dc.identifier.other | jennings_jeremy_k_201405_ms | |
dc.identifier.uri | http://purl.galileo.usg.edu/uga_etd/jennings_jeremy_k_201405_ms | |
dc.identifier.uri | http://hdl.handle.net/10724/30488 | |
dc.description.abstract | Each fall and spring semester, approximately 1,300 students take the Introductory Statistics course (STAT 2000) at the University of Georgia. The course coordinator maintains a repository of test questions from which he selects questions for each test. Having an item bank from which to pull items raises the question, “Are all of these items properly assessing students’ statistical achievement?” An IRT analysis will be performed to calibrate the items used on the third test of the semester, which assesses student knowledge and understanding of statistical inference for one sample. The best-fitting model for each item will be determined. Items that are not functioning properly will be reviewed for common factors that might indicate problem areas for students, and recommendations based on the analysis will be made. | |
dc.language | eng | |
dc.publisher | uga | |
dc.rights | public | |
dc.subject | Item Response Theory | |
dc.subject | Introductory Statistics | |
dc.subject | Educational Measurement | |
dc.subject | Statistics Education | |
dc.title | Calibrating test item banks for an introductory statistics course | |
dc.type | Thesis | |
dc.description.degree | MS | |
dc.description.department | Statistics | |
dc.description.major | Statistics | |
dc.description.advisor | Jennifer Kaplan | |
dc.description.committee | Jennifer Kaplan | |
dc.description.committee | George Englehard | |
dc.description.committee | Jeongyoun Ahn | |