Murthy Chittaiah, Nithyanianjan
Unknown Affiliation

Published: 1 document

Articles

Optimizing sparse ternary compression with thresholds for communication-efficient federated learning
Murthy Chittaiah, Nithyanianjan; Haladappa, Manjula Sunkadakatte
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 14, No 6: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijai.v14.i6.pp4902-4912

Abstract

Federated learning (FL) enables decentralized model training while preserving client data privacy, yet suffers from significant communication overhead due to frequent parameter exchanges. This study investigates how varying sparse ternary compression (STC) thresholds impact communication efficiency and model accuracy across the CIFAR-10 and MedMNIST datasets. Experiments tested thresholds ranging from 1.0 to 1.9 and batch sizes of 10, 15, and 20. Results demonstrated that selecting thresholds between 1.2 and 1.5 reduced total communication costs by approximately 10–15%, while maintaining acceptable accuracy levels. These findings suggest that careful threshold tuning can achieve substantial communication savings with minimal compromise in model performance, offering practical guidance for improving the efficiency and scalability of FL systems.
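The thresholding idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes one common STC formulation in which entries of a client's model update are kept only if their magnitude exceeds the threshold times the update's standard deviation, and surviving entries are replaced by the signed mean magnitude of the kept values (so the compressed update takes at most three distinct values).

```python
import numpy as np

def sparse_ternary_compress(update, threshold=1.2):
    """Sketch of sparse ternary compression (STC) with a tunable threshold.

    Assumption: the cutoff is `threshold` times the standard deviation of
    the update vector; survivors are ternarized to {-mu, 0, +mu}, where mu
    is the mean magnitude of the kept entries.
    """
    cutoff = threshold * np.std(update)
    mask = np.abs(update) > cutoff       # keep only large-magnitude entries
    compressed = np.zeros_like(update)
    if mask.any():
        mu = np.abs(update[mask]).mean() # shared magnitude for all survivors
        compressed[mask] = mu * np.sign(update[mask])
    return compressed

# A larger threshold zeroes more entries, shrinking what a client must
# transmit, at the cost of a coarser approximation of the update.
example = np.array([0.05, -0.02, 0.9, -1.1, 0.03])
print(sparse_ternary_compress(example, threshold=1.2))
```

Under this formulation, the 1.0 to 1.9 threshold range studied in the paper directly trades sparsity (communication savings) against fidelity of the transmitted update.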