This study examines Differential Item Functioning (DIF) in an instrument developed to measure preservice mathematics teachers’ ethnomathematical knowledge in Jambi. Anchored in the Rasch measurement framework, the study uses a quantitative ex post facto design to evaluate item‑level measurement invariance of the ethnomathematics assessment instrument. Rasch‑based procedures (item and person parameter estimation, fit statistics, and DIF analyses) were used to assess item equivalence across gender, educational background, and teaching experience, and to quantify the magnitude of any detected DIF. Data from 150 preservice teachers were analyzed with Rasch estimation and Mantel–Haenszel DIF procedures on a culturally contextualized ethnomathematics instrument. Descriptive analyses indicated strong psychometric properties: person reliability = 0.81, item reliability = 0.97, and fit statistics within acceptable ranges. DIF analyses flagged several items with statistically significant DIF, predominantly by gender, while most items functioned stably across the examined subgroups. Graphical DIF plots revealed heterogeneous item‑response patterns, highlighting the need for culturally responsive item revision to reduce bias. The findings demonstrate the Rasch model’s effectiveness in detecting item‑level bias in measures of preservice teachers’ ethnomathematical knowledge and support targeted refinement of the instrument.
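The Mantel–Haenszel DIF procedure named above can be sketched in a few lines. The following is a minimal illustration under stated assumptions (dichotomous item scores, examinees stratified by a matching total score, a reference and a focal group coded 0/1); it is not the authors’ implementation, which would typically be run in dedicated Rasch or DIF software.

```python
import numpy as np

def mantel_haenszel_dif(item, total, group):
    """Mantel–Haenszel DIF statistic for one dichotomous item.

    item  : 0/1 responses to the studied item
    total : matching variable (e.g. total or rest score)
    group : 0 = reference group, 1 = focal group
    Returns the MH common odds ratio alpha and the ETS delta-MH value
    (delta near 0 indicates negligible DIF).
    """
    item, total, group = map(np.asarray, (item, total, group))
    num = den = 0.0
    for s in np.unique(total):                # stratify by matching score
        m = total == s
        A = np.sum((group[m] == 0) & (item[m] == 1))  # reference correct
        B = np.sum((group[m] == 0) & (item[m] == 0))  # reference incorrect
        C = np.sum((group[m] == 1) & (item[m] == 1))  # focal correct
        D = np.sum((group[m] == 1) & (item[m] == 0))  # focal incorrect
        T = A + B + C + D
        if T > 0:                             # skip empty strata
            num += A * D / T
            den += B * C / T
    alpha = num / den if den > 0 else np.nan
    delta = -2.35 * np.log(alpha)             # ETS delta metric
    return alpha, delta
```

With data in which both groups answer identically within each score stratum, the function returns an odds ratio of 1 and a delta of 0, i.e. no DIF; departures from these values in either direction flag an item for review.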
Copyright © 2026