Indonesian Journal of Electrical Engineering and Computer Science
Vol 40, No 2: November 2025

Enhancing the ternary neural networks with adaptive threshold quantization

Truong, Son Ngoc



Article Info

Publish Date
01 Nov 2025

Abstract

Ternary neural networks (TNNs) with weights constrained to –1, 0, and +1 offer an efficient deep learning solution for low-cost computing platforms such as embedded systems and edge computing devices. These weights are typically obtained by quantizing the real-valued weights during training. In this work, we propose an adaptive threshold quantization method that dynamically adjusts the threshold based on the mean of the weight distribution. Unlike fixed-threshold approaches, our method recalculates the quantization threshold at each training epoch according to the distribution of the real-valued synaptic weights. This adaptation significantly enhances both training speed and model accuracy. Experimental results on the MNIST dataset demonstrate a 2.5× reduction in training time compared to conventional methods, with a 2% improvement in recognition accuracy. On the Google Speech Commands dataset, the proposed method achieves an 8% improvement in recognition accuracy and a 50% reduction in training time compared to fixed-threshold quantization. These results highlight the effectiveness of adaptive quantization in improving the efficiency of TNNs, making them well suited for deployment on resource-constrained edge devices.
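A minimal sketch of the adaptive threshold quantization described in the abstract, assuming the threshold is taken proportional to the mean absolute value of the real-valued weights and recomputed every epoch; the scale factor (0.7), the function name ternarize, and the tensor shapes are illustrative assumptions, not details taken from the paper.

import numpy as np

def ternarize(weights, scale=0.7):
    # Adaptive threshold derived from the current weight distribution.
    # The proportionality constant 0.7 is an assumption for illustration.
    threshold = scale * np.mean(np.abs(weights))
    ternary = np.zeros_like(weights)
    ternary[weights > threshold] = 1.0     # quantize large positive weights to +1
    ternary[weights < -threshold] = -1.0   # quantize large negative weights to -1
    return ternary, threshold              # weights with |w| <= threshold stay 0

# Illustrative per-epoch use: re-derive the threshold from the current
# real-valued weights before quantizing them for the forward pass.
rng = np.random.default_rng(0)
real_weights = rng.normal(0.0, 0.1, size=(256, 128))
for epoch in range(3):
    q_weights, t = ternarize(real_weights)
    # ... forward/backward pass with q_weights, then update real_weights ...
    print(f"epoch {epoch}: threshold = {t:.4f}")

Because the threshold tracks the evolving weight distribution rather than staying fixed, the fraction of weights mapped to zero adapts as training proceeds; the abstract attributes the reported gains in training speed and recognition accuracy to this adaptation.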

Copyright © 2025