How do medical students actually think while solving problems in three different types of clinical assessments
In medical education, clinical assessments such as the objective structured clinical examination (OSCE) and the modified essay question (MEQ) are widely used. Although both have numerous advantages, there is a need to minimize the limitations of each assessment and to evaluate medical students' diagnostic reasoning with a method that is both sound and affordable. This study explored an alternative form of clinical assessment: a multimedia case-based assessment (CBA). Various studies have investigated correlations between the OSCE and MEQ; however, little attention has been given to the types of thinking that medical students actually engage in during assessments. The purpose of this study was to identify medical students' cognitive processes in solving diagnostic problems and to compare how their thinking differed across three types of clinical assessments. A cross-case study design was employed. The study involved two fourth-year medical students who were videotaped taking the OSCE, CBA, and MEQ. Data were collected through one-on-one stimulated recall interviews in which students watched a video of themselves taking each assessment and were asked to elaborate on what they were thinking during each of the 20 partitioned clips of each video. Data were segmented into the smallest phrases or sentences representing a meaningful cognitive occurrence and coded using hypothetico-deductive reasoning (HDR) as representative of clinical reasoning. Any uncoded data were categorized as other cognitive occurrences, and all data were then reconstructed according to the chronology of each participant's actual performance in the assessment. The study revealed that both participants exhibited similar proportional frequencies across all types of cognitive occurrences; however, each type of clinical assessment presented a different pattern of proportional frequencies in the clinical reasoning process.
Moreover, other cognitive occurrences that distracted from students' clinical reasoning were also detected, such as test-taking strategies, point-seeking/hunting, and unnecessary constraints. Based on these findings, suggestions for future research are provided. This study's research design may be used to validate clinical assessments of diagnostic reasoning, and its results can inform the redesign of clinical assessments, including multimedia case-based assessment, for medical education.