First Advisor

Lora Claywell

College

Rueckert-Hartman College for Health Professions

Degree Name

Doctor of Nursing Practice

School

Loretto Heights School of Nursing

Division

Nursing

Document Type

Thesis - Open Access

Number of Pages

51 pages

Abstract

Background: Differences in testing practices can lead to student frustration and questionable evaluation of student learning. Inconsistencies in testing include items written at differing levels of Bloom’s taxonomy, faculty-written items versus publisher test-bank items, and variations in grading. Failure to provide quality tools and methods for assessing and evaluating student learning can negatively impact both student and program outcomes. Analyzing test items can improve the quality and consistency of the evaluation of student learning.

Objective/Purpose of the Project: This quality improvement DNP project investigated the impact of an educational intervention about multiple-choice question (MCQ) test item analysis on the Health Sciences Department faculty of a small community college.

Methods: Seven of 66 Health Sciences faculty attended a one-hour educational session on test item analysis. Data were collected via a pre-test/post-test survey design. In addition to demographic data, a Likert scale was used to rate participants’ knowledge of, confidence in completing, and ability to overcome barriers to implementing test item analysis. Participants also completed an item analysis on a practice test item, rating its difficulty and discrimination and stating a plan for its use. Finally, upon completion of the intervention, participants rated their intent to incorporate test item analysis into their curricula. Paired-sample t-tests and frequency distributions were used to describe the results.
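
To make the practice exercise concrete, a minimal sketch of the two classical statistics rated for each item, the difficulty index (proportion of examinees answering correctly) and the discrimination index (the difference in that proportion between the top- and bottom-scoring groups), is shown below in Python. The data and the conventional 27% grouping cutoff are illustrative assumptions, not taken from the project.

    import numpy as np

    def item_analysis(item_correct, total_scores, group_frac=0.27):
        """Classical item analysis for one dichotomously scored MCQ.

        item_correct -- 1/0 per examinee: answered this item correctly?
        total_scores -- each examinee's total test score, used for grouping
        group_frac   -- share of examinees in the upper/lower groups
                        (the conventional 27% cutoff; an assumption here)
        """
        item_correct = np.asarray(item_correct, dtype=float)
        order = np.argsort(total_scores)       # low scorers first
        k = max(1, round(group_frac * len(item_correct)))

        difficulty = item_correct.mean()           # proportion correct overall
        lower = item_correct[order[:k]].mean()     # bottom group's proportion
        upper = item_correct[order[-k:]].mean()    # top group's proportion
        return difficulty, upper - lower           # D = upper minus lower

    # Illustrative data for 10 examinees on one item (not from the project).
    correct = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
    totals = [38, 22, 45, 40, 19, 44, 25, 47, 41, 21]
    p, d = item_analysis(correct, totals)
    print(f"difficulty = {p:.2f}, discrimination = {d:.2f}")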

Results: Following the educational session, participants reported increases in comprehension of test item analysis (p = .030), confidence in completing it (p = .008), and ability to overcome barriers to implementing it (p = .030). Pre-education, 2 participants (14.3%) correctly completed the practice item analysis ratings of difficulty, discrimination, and plan for use, compared with 3 (43%) post-education. Five participants (71.4%) strongly agreed that they intended to incorporate item analysis into their curricula following the education.
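
The p-values above would come from paired-sample t-tests on each participant’s pre- and post-session ratings; a minimal sketch using scipy follows, with made-up Likert ratings for seven participants standing in for the project’s unpublished raw data.

    from scipy import stats

    # Hypothetical 5-point Likert ratings of confidence in completing
    # test item analysis, before and after the session, for the same
    # seven participants (illustrative values, not the project's data).
    pre = [2, 3, 2, 1, 3, 2, 2]
    post = [4, 4, 3, 3, 4, 3, 4]

    # Paired-sample t-test: each participant serves as their own control.
    result = stats.ttest_rel(post, pre)
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")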

Conclusions/Application to Practice: Performing test item analysis as part of a program testing policy can improve the quality of the evaluation of student learning, increase the use of evidence-based practices in education, and strengthen health science programs through consistent grading practices.

Date of Award

Spring 2021

Location (Creation)

Colorado (state); Denver (county); Denver (inhabited place)

Rights Statement

All content in this Collection is owned by and subject to the exclusive control of Regis University and the authors of the materials. It is available only for research purposes and may not be used in violation of copyright laws or for unlawful purposes. The materials may not be downloaded in whole or in part without permission of the copyright holder or as otherwise authorized in the “fair use” standards of the U.S. copyright laws and regulations.
