Reducing Differential Item Functioning via Process Data
Ling Chen, Susu Zhang, Jingchen Liu
[stat.AP]
Test fairness is a major concern in psychometric and educational research. A typical approach for assessing test fairness is differential item functioning (DIF) analysis. DIF arises when a test item functions differently across subgroups, which are typically defined by the respondents’ demographic characteristics. Most existing research has focused on the statistical detection of DIF, while less attention has been given to reducing or eliminating DIF and understanding why it occurs. Meanwhile, computer-based assessments have become increasingly popular. The data generated by a respondent’s interactions with an item are recorded in computer log files and are referred to as process data. Process data provide valuable insights into respondents’ problem-solving strategies and progress, offering new opportunities for DIF analysis. In this paper, we propose a novel method within the framework of generalized linear models (GLMs) that leverages process data to reduce and understand DIF. Specifically, we construct a nuisance trait surrogate from features extracted from process data. With the constructed nuisance trait, we introduce a new scoring rule that incorporates respondents’ behaviors captured through process data on top of the target latent trait. We demonstrate the effectiveness of our approach through extensive simulation experiments and an application to thirteen Problem Solving in Technology-Rich Environments (PSTRE) items from the 2012 Programme for the International Assessment of Adult Competencies (PIAAC) assessment.
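The abstract’s idea of augmenting a GLM item response model with a process-data-based nuisance trait can be illustrated with a minimal sketch. This is not the paper’s actual method: the feature matrix, the use of the first principal component as the nuisance trait surrogate, and all parameter values (`a`, `gamma`, `d`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 6  # hypothetical: 500 respondents, 6 process-data features

# Simulated process-data features (e.g., action counts, response times)
X = rng.normal(size=(n, p))

# Nuisance trait surrogate: first principal component of the centered features
# (one simple choice of feature extraction; the paper's construction may differ)
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
eta = Xc @ Vt[0]
eta = (eta - eta.mean()) / eta.std()  # standardized surrogate per respondent

# Target latent trait (known here because we simulate; estimated in practice)
theta = rng.normal(size=n)

def response_prob(theta, eta, a=1.2, gamma=0.5, d=-0.3):
    """Logistic GLM with both the target trait (theta) and the
    nuisance trait surrogate (eta) in the linear predictor."""
    return 1.0 / (1.0 + np.exp(-(a * theta + gamma * eta + d)))

probs = response_prob(theta, eta)
y = rng.binomial(1, probs)  # simulated dichotomous item responses
```

Conditioning the scoring rule on `eta` in this way is what would, in principle, absorb group differences in problem-solving behavior that otherwise surface as DIF with respect to `theta` alone.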