High-stakes testing can be frustrating for students who are confident in their knowledge of the test content but are still unable to pass the examination. One potential reason for this difficulty is misunderstanding of the test questions. This research utilizes the Language Variation Tool for physical therapist students/graduates (LVT-PT) and builds on work previously published in the Journal of Interdisciplinary Education in 2011 and 2016 (Housel et al., 2011; Michaels et al., 2016).
Language and communication skills have been cited as major contributors to student success and attrition (Bosher & Smalkoski, 2002; Bosher, 2003). Performance on a verbose high-stakes examination may depend more on reading comprehension than on content knowledge (Haladyna, 2004; Schellenberg, 2004). Students who demonstrate advanced knowledge of course content will still sometimes report difficulty passing multiple-choice examinations, simply because they misunderstand the test questions. Although a multitude of interacting variables could lead to discrepancies in test scores, this research focuses on English language variation in multiple-choice questions.
The 2011 Study
In 2007, Housel and associates developed a diagnostic multiple-choice testing instrument, the Language Variation Tool for physical therapist students/graduates (LVT-PT), to help determine whether a student in a graduate physical therapy program was affected by the wording of questions during multiple-choice testing. A pilot study using the LVT-PT was performed at Tennessee State University (TSU) with the doctor of physical therapy (DPT) graduating classes of 2008, 2009, and 2010 (Housel, 2011). Although some students from all cultural backgrounds had difficulty answering questions when they were re-worded, a higher percentage of African American students demonstrated this difficulty (intercept: p < .001).
The 2016 Study
It became important to know the predictive effectiveness of the LVT-PT for future high-stakes testing, specifically the National Physical Therapy Examination (NPTE). A study conducted by Michaels et al. (2016) demonstrated that the LVT-PT had predictive value. The creators of the NPTE publish a practice examination, the Practice Examination Assessment Tool (PEAT), that uses language very close to that of the NPTE (FSBPT, 2015). Barredo, Tan, and Raynes (2015, preliminary result) found that PEAT scores correlate highly with NPTE results. Because scores from the PEAT were more convenient to access than scores for the NPTE, the PEAT was used in this study. There was a moderately strong correlation between LVT-PT and PEAT scores (Pearson correlation = .662; p < .001). This correlation was stronger than that of other practice tools that do not use the specific language variation found in the PEAT and the NPTE (correlation = .392; p = .029). Problems with language variation during multiple-choice testing may be more common than originally thought: in the earlier study, 22 of the 35 students (62.8%) scored a difference of 15% or more on the LVT-PT (Michaels et al., 2016; Figure 1).
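To make the reported statistic concrete, the Pearson coefficient used above is a straightforward computation over paired score lists. The sketch below uses only the Python standard library and entirely hypothetical scores for ten students; the actual LVT-PT and PEAT data are not reproduced in this article.

```python
# Illustrative sketch only: the score lists are hypothetical stand-ins
# for LVT-PT scores (percent correct) and PEAT scaled scores.
from statistics import mean, stdev

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    mx, my = mean(x), mean(y)
    # Sample covariance divided by the product of sample standard deviations.
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Hypothetical paired scores for 10 students.
lvt_pt = [72, 65, 80, 55, 90, 60, 75, 68, 85, 58]
peat   = [600, 560, 640, 520, 680, 545, 615, 575, 655, 530]

r = pearson_r(lvt_pt, peat)
print(round(r, 3))
```

In practice, a library routine such as `scipy.stats.pearsonr` would typically be used instead, since it also returns the p-value that accompanies the correlations reported above.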
Importance of Intervention
The first step to intervention is feedback. Butler and Roediger (2008) found that the provision of feedback after testing led to an increased proportion of correct responses on later testing. Once success can be predicted, feedback can be provided, and then assistance can be implemented.
The LVT was created in 2008, following an outline from Domholdt's Rehabilitation Research: Principles and Applications, 3rd ed. (Domholdt, 2004). The recommended steps include drafting, expert review, a first revision, a pilot test, and a final revision. Although the creation of the LVT is only briefly summarized in this article, a more comprehensive account of the tool's development is given in the 2011 article (Housel et al., 2011)...