Rueckert-Hartman College for Health Professions
Doctor of Nursing Practice
Loretto Heights School of Nursing
Thesis - Open Access
Background: Differences in testing can lead to student frustration and questionable evaluation of student learning. Inconsistencies in testing practices include differing levels of Bloom's taxonomy, faculty-written items versus publisher test-bank items, and variations in grading. Failure to provide quality tools and methods for assessing and evaluating student learning can negatively impact both student and program outcomes. Performing test-item analysis can improve the quality and consistency of the evaluation of student learning.
Objective/Purpose of the Project: This quality improvement DNP project investigated the impact of an educational intervention on multiple-choice question (MCQ) test-item analysis among the Health Sciences Department faculty of a small community college.
Methods: Seven of 66 Health Science faculty members participated in a one-hour educational session on test-item analysis. Data were collected via a pre-test/post-test survey design. In addition to demographic data, a Likert scale was used to rate participants' knowledge of, confidence in completing, and ability to overcome barriers to implementing test-item analysis. Participants also completed an item analysis on a practice test item, rating its difficulty and discrimination and stating a plan for its use. Finally, upon completion of the intervention, participants were asked to rate their intent to incorporate test-item analysis into their curricula. Paired-sample t-tests and frequency distributions were used to describe the results.
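The two item statistics participants rated, difficulty and discrimination, have standard definitions in classical test theory: difficulty is the proportion of students answering the item correctly, and discrimination compares performance on the item between high- and low-scoring students. A minimal sketch of those calculations (the function name and the sample data are hypothetical, not taken from the project):

```python
# Illustrative sketch of classic MCQ item-analysis statistics.
# `responses` pairs each student's total test score with whether the
# student answered this one item correctly (1) or not (0).

def item_analysis(responses):
    """Return (difficulty, discrimination) for one test item.

    difficulty: proportion of all students answering correctly.
    discrimination: difference in that proportion between the top and
    bottom 27% of students ranked by total test score (the D index).
    """
    n = len(responses)
    difficulty = sum(correct for _, correct in responses) / n

    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    k = max(1, round(0.27 * n))  # size of the upper and lower groups
    p_upper = sum(c for _, c in ranked[:k]) / k
    p_lower = sum(c for _, c in ranked[-k:]) / k
    return difficulty, p_upper - p_lower

# Hypothetical class of 10: (total_score, answered_this_item_correctly)
data = [(95, 1), (90, 1), (88, 1), (80, 1), (75, 0),
        (70, 1), (65, 0), (60, 0), (55, 0), (40, 0)]
diff, disc = item_analysis(data)
```

An item answered correctly mostly by high scorers (large positive D) discriminates well; a D near zero or negative flags an item to revise or discard.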
Results: Following the educational session, participants reported an increase in comprehension of test-item analysis (p = .030), confidence in completing it (p = .008), and ability to overcome barriers to implementing it (p = .030). Pre-education, 2 (14.3%) responded correctly regarding item difficulty, discrimination, and plan for use, compared to 3 (43%) post-education. Following the education, 71.4% (n = 5) strongly agreed that they intended to incorporate item analysis into their curricula.
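The paired-sample t-tests behind these p-values compare each participant's pre- and post-session ratings. A minimal sketch of the test statistic, using hypothetical Likert ratings rather than the study's actual data:

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-sample t statistic (df = n - 1): the mean of the
    per-participant differences divided by its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical pre/post Likert ratings for 7 participants
pre  = [2, 3, 2, 1, 3, 2, 2]
post = [4, 4, 3, 3, 4, 3, 4]
t = paired_t(pre, post)
```

The resulting t is then compared against the t distribution with n − 1 degrees of freedom to obtain the p-value; in practice a library routine such as SciPy's paired t-test would report both at once.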
Conclusions/Application to Practice: Performing test-item analysis as part of a program testing policy can improve the quality of the evaluation of student learning, increase the use of evidence-based practices in education, and strengthen health science programs through consistency in grading practices.
© Tunisia Love
Love, Tunisia, "Improving Health Science Faculty's Comprehension of Test Item-Analysis" (2021). Regis University Student Publications. 1008.