Identifying difficult topics and problematic subtopics using Item Response Theory (IRT) analysis of mastery mode homework questions in a national Online Web-Based Learning (OWL) database
Moody, John David
Item Response Theory (IRT) analysis of General Chemistry exams has been used successfully at the University of Georgia (UGA) to identify key topics that students find difficult. Our preliminary analysis of Online Web-Based Learning (OWL) homework questions in Question mode showed some overlap between the difficult topics at UGA and those found nationwide. In addition, the preliminary analysis identified three more nationally difficult topics. Unfortunately, Question mode questions accounted for only 15% of the total OWL responses, and many questions in the identified difficult topics had fewer than 300 responses. Therefore, we also conducted IRT analysis of the OWL homework database in Mastery (Question Pool) mode. Since most of the data we analyzed came from questions on which students were allowed 15 or more attempts (in many cases, unlimited attempts), we also had to determine which attempt produced the question’s most normally distributed Total Information Curve (TIC). The parameters of the two modes were also compared to each other and to parameters from JExam homework. In addition, the difficult topics identified in each mode will be examined. We will also look at small differences between two questions that result in a large difference in difficulty, such as an equilibrium question in which no solids are present vs. one in which a solid is present. Lastly, we will summarize the results and make suggestions for an instructor who wants to build a homework and testing database from which response data can be downloaded easily.
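To illustrate the IRT quantities the abstract refers to, the following is a minimal sketch of the two-parameter logistic (2PL) model and a Total Information Curve. The item parameters here are hypothetical placeholders, not values from this study, and the study itself may have used a different IRT model:

```python
import math

def p_2pl(theta, a, b):
    """Probability of a correct response under the 2PL model,
    given ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information contributed by one 2PL item at ability theta:
    I(theta) = a^2 * P * (1 - P)."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def total_information(theta, items):
    """Total (test) information curve value: the sum of the
    individual item information curves at a given ability."""
    return sum(item_information(theta, a, b) for a, b in items)

# Hypothetical (discrimination, difficulty) pairs for three items.
items = [(1.2, -0.5), (0.8, 0.0), (1.5, 1.0)]

# Evaluate the TIC on an ability grid from -4 to 4.
tic = [total_information(t / 10.0, items) for t in range(-40, 41)]
```

In this framing, "most normally distributed TIC" can be read as selecting the attempt whose summed information curve is closest to a symmetric, bell-shaped curve over the ability range.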