Iron is one of the most prevalent groundwater contaminants and, at elevated concentrations, causes significant aesthetic, operational, and infrastructural problems. This study aims to (i) experimentally evaluate the effects of pH, dissolved oxygen (DO), and sodium hypochlorite (NaOCl) dosage on iron removal efficiency, and (ii) develop an interpretable Gene Expression Programming (GEP) model to predict the optimal NaOCl dose under varying water-quality conditions. Laboratory jar tests demonstrated that iron oxidation is strongly pH-dependent, with maximum removal efficiency (up to 99%) achieved under acidic conditions (pH 4) at a NaOCl dose of 6 mg/L, owing to the predominance of hypochlorous acid. Under the near-neutral conditions relevant to drinking-water treatment (pH 6.5–7.5), aeration alone enhanced iron removal as DO increased, although removal gains diminished beyond a DO of 6 mg/L, making further aeration uneconomical given its energy demand. A combined treatment strategy of low-dose pre-chlorination followed by aeration exhibited a clear synergistic effect, achieving iron removal efficiencies of approximately 85–89% with NaOCl doses of 1–3 mg/L and DO levels of 4–5 mg/L. This approach reduced overall operational costs by approximately 40% relative to aeration-only treatment. The developed GEP model showed strong predictive performance (R² = 0.94; RMSE = 0.34 mg/L) and generated explicit mathematical expressions linking oxidant demand to pH, DO, and influent iron concentration. Overall, the study confirms the technical and economic advantages of combining pre-chlorination with aeration and highlights the potential of GEP as a transparent decision-support tool for optimizing groundwater iron removal.
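As a minimal sketch of how a GEP-derived dosing expression might be evaluated and validated, the snippet below pairs a *hypothetical* placeholder formula (the abstract does not reproduce the paper's actual expression, so `gep_dose` and its coefficients are illustrative assumptions only) with the two fit statistics reported above, R² and RMSE, computed from scratch:

```python
import math

def gep_dose(pH, DO, Fe):
    """Predicted NaOCl dose (mg/L) from pH, DO (mg/L), and influent Fe (mg/L).

    Hypothetical illustrative expression ONLY -- the study's actual
    GEP-generated formula is not given in the abstract; this merely shows
    the explicit, closed-form style of model that GEP produces.
    """
    return max(0.0, 0.8 * Fe + 0.15 * (7.0 - pH) - 0.1 * DO + 1.2)

def r2_rmse(y_true, y_pred):
    """Coefficient of determination and root-mean-square error."""
    n = len(y_true)
    mean = sum(y_true) / n
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot, math.sqrt(ss_res / n)
```

Because the fitted model is an explicit algebraic expression rather than a black-box predictor, operators can inspect and audit it directly, which is the "transparent decision-support" property the study emphasizes.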
Copyright © 2026