Background: Medical waste management in resource-limited healthcare facilities remains dominated by manual segregation, which is error-prone and difficult to standardize. Existing automated solutions often rely on cloud-based deep learning or high-cost hardware, limiting real-time deployment at the point of waste generation.

Objective: This study aimed to develop and evaluate a medical waste classification system integrating Tiny Machine Learning (TinyML) and multi-sensor fusion on a low-cost embedded device to achieve accurate, real-time, and resource-efficient on-device inference.

Method: An experimental system design approach was employed, encompassing dataset construction, model development, and embedded deployment. A TinyML-optimized MobileNetV2 model was integrated with heterogeneous sensor fusion and evaluated under embedded constraints to assess classification accuracy, inference latency, and memory usage.

Results: The vision-only model achieved an accuracy of 84.5%, with frequent misclassification of sharps waste. After integrating sensor fusion, overall accuracy increased to 96.5%, and recall for sharps reached 98%. The system demonstrated efficient on-device inference with an average latency of 280 ms and low memory consumption (<1 MB).

Conclusion: The proposed TinyML-based sensor fusion system provides a robust, accurate, and cost-effective solution for automated medical waste classification. This approach enhances healthcare worker safety and supports scalable deployment in resource-limited healthcare environments.
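To illustrate how sensor fusion can recover vision-only misclassifications of sharps, the following is a minimal late-fusion sketch. It is not the paper's implementation: the class list, the use of an inductive (metal) sensor flag, and the fusion weight are all illustrative assumptions; the idea is simply that a non-visual sensor cue re-weights the vision model's softmax output before the final decision.

```python
import numpy as np

# Hypothetical class list; the actual waste categories are not
# specified in this abstract.
CLASSES = ["general", "infectious", "sharps", "pharmaceutical"]

def fuse(vision_probs, metal_detected, boost=0.35):
    """Late fusion: boost the 'sharps' score when the inductive
    (metal) sensor fires, then renormalize to a distribution.
    `boost` is an illustrative weight, not a value from the study."""
    probs = np.asarray(vision_probs, dtype=float).copy()
    if metal_detected:
        probs[CLASSES.index("sharps")] += boost
    return probs / probs.sum()

# Vision model alone is uncertain between infectious and sharps;
# the metal-sensor cue tips the decision toward sharps.
fused = fuse([0.10, 0.45, 0.40, 0.05], metal_detected=True)
print(CLASSES[int(np.argmax(fused))])
```

Late fusion of this kind is attractive on embedded targets because it adds only a few arithmetic operations after the neural network runs, leaving the reported latency and memory budgets essentially unchanged.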
Copyright © 2026