The effect of item and person misfit on selection decisions: An empirical study

Rob R. Meijer, Jorge N. Tendeiro

Research output: Book/Report › Report › Professional


Item response theory (IRT) is a mathematical modeling framework that is often applied in the development and analysis of educational and psychological assessments. Various IRT models exist, and practitioners must choose the model that is most appropriate for their particular assessment. Even when the most appropriate model is applied, the fit of the assessment data to the model is rarely perfect in practice. How serious, then, is model misfit for practical decision-making? In this study we analyze two empirical datasets to investigate the effect of removing misfitting items and misfitting item score patterns on the rank order of test takers according to their proficiency level score. Results for two different IRT models were compared. We found that the impact of removing misfitting items and item score patterns varied depending on the IRT model applied. This effect was more serious when selecting a small to moderate percentage of test takers; when a larger percentage was selected, misfit had little effect on the selection outcome.
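The report itself contains the full analysis; as a minimal sketch of the kind of procedure the abstract describes, the Python snippet below flags misfitting response patterns with the lz person-fit statistic under a 2PL model and checks how removing them changes who falls in a selected top percentage. This is not the authors' code: the simulated 2PL parameters, the lz cutoff, and the 10% selection rate are illustrative assumptions, and raw sum scores stand in for IRT proficiency estimates.

```python
# Sketch: person-fit screening and its effect on a top-p% selection.
# All parameters are simulated; in practice p would be computed from
# estimated item parameters and abilities, not the true values used here.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n_persons, n_items = 1000, 40

# Simulated 2PL item parameters and abilities (illustrative assumptions).
a = rng.uniform(0.8, 2.0, n_items)       # discriminations
b = rng.normal(0.0, 1.0, n_items)        # difficulties
theta = rng.normal(0.0, 1.0, n_persons)  # abilities

# 2PL response probabilities and scored (0/1) responses.
p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))
x = (rng.random((n_persons, n_items)) < p).astype(int)

def lz_statistic(x, p):
    """Standardized log-likelihood person-fit statistic (Drasgow et al., 1985)."""
    ll = np.sum(x * np.log(p) + (1 - x) * np.log(1 - p), axis=1)
    e_ll = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p), axis=1)
    v_ll = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2, axis=1)
    return (ll - e_ll) / np.sqrt(v_ll)

lz = lz_statistic(x, p)
misfit = lz < norm.ppf(0.05)  # large negative lz signals an aberrant pattern

# Selection decision: top 10% by score, before and after removing misfit.
score = x.sum(axis=1)
selected_all = set(np.flatnonzero(score >= np.quantile(score, 0.90)))

keep_ids = np.flatnonzero(~misfit)
score_fit = score[keep_ids]
selected_fit = set(keep_ids[score_fit >= np.quantile(score_fit, 0.90)])

overlap = len(selected_all & selected_fit) / len(selected_all)
print(f"flagged {misfit.sum()} misfitting patterns; "
      f"selection overlap before/after removal: {overlap:.2%}")
```

Under this setup the overlap tends to be high when the selection fraction is large and drops as the selection becomes more stringent, which mirrors the pattern reported in the abstract.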
Original language: English
Place of publication: Newtown, PA, USA
Publisher: Law School Admission Council (USA)
Number of pages: 17
Publication status: Published - Oct-2015

Publication series

Name: LSAC Research Report Series
Publisher: LSAC (Law School Admission Council)

