Thesis
Investigating item parameter drift across computer- and paper-based assessment modes in PISA 2015 mathematics
Washington State University
Master of Arts (MA), Washington State University
12/2019
DOI:
https://doi.org/10.7273/000000058
Handle:
https://hdl.handle.net/2376/119260
Abstract
International large-scale assessments (LSAs) are important for educational comparisons among countries. The Programme for International Student Assessment (PISA), an influential international LSA, has assessed educational outcomes in participating countries every three years since 2000. Its results inform policy makers about the performance of 15-year-old students in reading, mathematics, and science literacy. The validity of the test scores received heightened attention in PISA 2015 because most countries transitioned from paper-based assessment (PBA) to computer-based assessment (CBA). To make valid comparisons, it is essential to ensure measurement and test score equivalence between assessment modes. This study investigated item parameter drift (IPD) across the two assessment modes in the PISA 2015 mathematics test. The data consist of the responses of all students from PISA 2015 participating countries to the 81 mathematics items administered in both assessment modes. The analyses were performed using the iterative hybrid ordinal logistic regression-IRT (OLR/IRT) method.
The likelihood-ratio chi-square test, the ∆β criterion, and the ∆R² criterion were used to flag items and forms exhibiting IPD. Based on the chi-square significance test, 70 of the 81 items were flagged with uniform IPD and 60 of the 81 items with non-uniform IPD. With the ∆β criterion, three dichotomous items out of the 81 mathematics items revealed uniform IPD across the assessment modes. The ∆R² criterion did not flag any item as showing IPD. These results indicate that assessment mode is one source of IPD and needs careful consideration in the implementation of LSAs. The findings point to potential threats to the validity of test score interpretation when assessment modes differ. The study provides useful information for evaluating item parameter invariance when a transition occurs between assessment modes, and it suggests that item types and item content domains should be considered when changing assessment modes.
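The full iterative hybrid OLR/IRT procedure is beyond the scope of this abstract, but the core logistic-regression drift screen it relies on can be sketched. The sketch below is a hypothetical illustration on simulated data, not the thesis code: it uses binary logistic regression for a single dichotomous item (the thesis also handles polytomous items via ordinal logistic regression), and all names such as `dif_screen` are assumptions. Three nested models are fit; the mode main effect tests uniform drift, the mode-by-ability interaction tests non-uniform drift, each via a likelihood-ratio chi-square with one degree of freedom.

```python
import numpy as np

def fit_logistic(X, y, n_iter=50):
    """Fit a binary logistic regression by Newton-Raphson.

    Returns the coefficient vector and the maximized log-likelihood.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        w = p * (1.0 - p)                       # IRLS weights
        grad = X.T @ (y - p)
        hess = X.T @ (X * w[:, None])
        beta = beta + np.linalg.solve(hess + 1e-8 * np.eye(X.shape[1]), grad)
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    ll = float(np.sum(y * np.log(p + 1e-12) + (1.0 - y) * np.log(1.0 - p + 1e-12)))
    return beta, ll

def dif_screen(theta, group, y, crit=6.635):
    """Flag uniform / non-uniform drift for one dichotomous item.

    Nested models: M0 (ability only), M1 (+ mode), M2 (+ mode x ability).
    `crit` is the chi-square critical value for df = 1 at alpha = .01.
    """
    ones = np.ones_like(theta)
    X0 = np.column_stack([ones, theta])
    X1 = np.column_stack([ones, theta, group])
    X2 = np.column_stack([ones, theta, group, theta * group])
    _, ll0 = fit_logistic(X0, y)
    b1, ll1 = fit_logistic(X1, y)
    _, ll2 = fit_logistic(X2, y)
    return {
        "uniform_flag": 2.0 * (ll1 - ll0) > crit,     # mode main effect
        "nonuniform_flag": 2.0 * (ll2 - ll1) > crit,  # mode x ability effect
        "delta_beta": float(b1[2]),                   # estimated location shift
    }

# Simulated example: an item 0.7 logits easier on computer (uniform drift only).
rng = np.random.default_rng(42)
n = 4000
theta = rng.normal(size=n)                          # ability
group = rng.integers(0, 2, size=n).astype(float)    # 0 = paper, 1 = computer
p_true = 1.0 / (1.0 + np.exp(-(1.2 * theta - 0.3 + 0.7 * group)))
y = (rng.random(n) < p_true).astype(float)

result = dif_screen(theta, group, y)
```

Under these assumptions the screen should flag the simulated item for uniform drift and recover a ∆β estimate near the generating shift of 0.7; the ∆R² criterion used in the thesis would be an analogous comparison of pseudo-R² values between the nested models.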
Key Words: Item parameter drift, Validity, Computer-based assessment, Paper-based assessment, International large-scale assessments, PISA
Details
- Title
- Investigating item parameter drift across computer- and paper-based assessment modes in PISA 2015 mathematics
- Creators
- Mehriban Ceylan
- Contributors
- Shenghai Dai (Degree Supervisor) - Washington State University
- Brian F. French (Committee Member) - Washington State University
- Chad Martin Gotch (Committee Member) - Washington State University
- Awarding Institution
- Washington State University
- Academic Unit
- Department of Kinesiology and Educational Psychology
- Theses and Dissertations
- Master of Arts (MA), Washington State University
- Publisher
- Washington State University
- Format
- pdf
- Number of pages
- 71
- Identifiers
- 99900591863001842
- Language
- English
- Resource Type
- Thesis