Autoencoder-based models have shown strong potential for anomaly detection in complex time-series data; however, they often assume equal importance across latent dimensions, resulting in inefficiency and reduced precision. This study addresses this limitation by introducing the Entropy-Based Bit Allocation Quantizer (EBAQ), a novel quantization framework that adaptively allocates bits to each latent dimension based on its entropy, preserving more precision where information content is highest. The primary objective is to enhance representational efficiency and anomaly detection performance without increasing model complexity or computational cost. EBAQ is implemented as a plug-and-play module within a standard autoencoder architecture, requiring no retraining or architectural modification. The method was evaluated on a publicly available ECG dataset using reconstruction-based anomaly detection. Results show that EBAQ outperforms the standard autoencoder baseline, achieving higher accuracy (94.9%), precision (99.4%), and recall (91.4%), while also exhibiting clearer separation between normal and anomalous data in latent-space visualizations. These findings confirm that entropy-aware quantization improves both fidelity and interpretability in unsupervised anomaly detection. Overall, this work presents a theoretically grounded and practically efficient solution that bridges information theory and deep learning, offering a human-centered approach to developing more intelligent and efficient AI systems for real-world applications.
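The core idea of entropy-based bit allocation can be sketched briefly. This is a minimal illustrative implementation, not the paper's exact EBAQ procedure: per-dimension entropy is estimated here with a simple histogram, the bit budget (`total_bits`), bin count, and proportional-allocation rule are all assumptions, and the quantizer is a plain uniform quantizer applied per latent dimension.

```python
import numpy as np

def entropy_bit_allocation(latents, total_bits=16, n_bins=32, min_bits=1):
    """Allocate a bit budget across latent dimensions in proportion to each
    dimension's histogram-estimated entropy (illustrative sketch only)."""
    d = latents.shape[1]
    entropies = np.empty(d)
    for j in range(d):
        counts, _ = np.histogram(latents[:, j], bins=n_bins)
        p = counts / counts.sum()
        p = p[p > 0]                      # drop empty bins (0 * log 0 := 0)
        entropies[j] = -(p * np.log2(p)).sum()
    # Proportional allocation with a floor of min_bits per dimension.
    weights = entropies / entropies.sum()
    bits = np.maximum(min_bits, np.floor(weights * total_bits)).astype(int)
    return bits

def quantize(latents, bits):
    """Uniformly quantize each dimension to 2**bits[j] levels over its range."""
    q = np.empty_like(latents)
    for j, b in enumerate(bits):
        lo, hi = latents[:, j].min(), latents[:, j].max()
        if hi == lo:                      # degenerate (constant) dimension
            q[:, j] = lo
            continue
        step = (hi - lo) / (2 ** b - 1)
        q[:, j] = lo + np.round((latents[:, j] - lo) / step) * step
    return q
```

Under this allocation, a high-entropy latent dimension (e.g., near-uniformly distributed codes) receives many more bits than a low-entropy one (e.g., codes concentrated on a few values), which is the behavior the abstract attributes to EBAQ.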
Copyright © 2026