Purpose – Fair assessment is a fundamental pillar for ensuring that evaluation results accurately reflect students' abilities without bias. This study aims to identify Differential Item Functioning (DIF) in the Final School Assessment (PAS) instrument for the Chemistry subject across four demographic variables: gender, family economic status, residential location, and school of origin.

Methodology – This study uses a quantitative design with a descriptive-exploratory approach. The research subjects were the responses of 1,840 twelfth-grade students at senior high schools in Maros Regency. Data were analyzed with an Item Response Theory (IRT) approach in R (RStudio 2024.04.2, Build 764). After the assumption tests (unidimensionality and local independence) and a model-fit comparison, the 1-Parameter Logistic (1PL) model was selected as the best-fitting model. DIF detection was performed using Raju's Area Measures.

Findings – The analysis showed that the assumptions of unidimensionality and local independence were met. Of the 30 items, five showed statistically significant DIF (p < 0.05): one item by gender (Item 11), one by residential location (Item 26), and three by economic status (Items 19, 23, and 24). No items showed DIF by school of origin. Although statistically flagged, the effect-size analysis placed all five DIF items in Category 'A' (negligible) according to the ETS criteria.

Contribution – This study provides empirical evidence on the fairness of an assessment instrument developed by a teacher association (MGMP). It highlights how non-academic factors can manifest as measurable differences in item performance, and it affirms the importance of DIF analysis as a standard quality-assurance procedure for assessment instruments, ensuring fairness for all students.
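As an illustrative sketch (not the study's actual R code), Raju's area measure under a 1PL model reduces to a simple closed form: because the discrimination parameter is shared by both groups, the area between the reference- and focal-group item characteristic curves equals the difference in the item difficulty estimates, and significance can be tested with a normal approximation. The parameter values below are hypothetical.

```python
import math

def raju_area_1pl(b_ref, b_foc, se_ref, se_foc):
    """Raju's signed area between reference- and focal-group ICCs
    under a 1PL model. With equal discriminations the exact area
    is the difference in difficulty parameters, b_foc - b_ref."""
    area = b_foc - b_ref
    # z statistic: signed area divided by the pooled standard error
    z = area / math.sqrt(se_ref**2 + se_foc**2)
    # two-tailed p-value from the standard normal approximation
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return area, z, p

# Hypothetical difficulty estimates and standard errors for one item
area, z, p = raju_area_1pl(b_ref=0.40, b_foc=0.95, se_ref=0.12, se_foc=0.15)
print(f"signed area = {area:.2f}, z = {z:.2f}, p = {p:.4f}")
```

A statistically significant z does not by itself imply practically important DIF, which is why the study additionally classifies flagged items by effect size (ETS categories A/B/C).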