A Comprehensive Review of Knowledge Distillation for Lightweight Medical Image Segmentation
Asmat Burhan; Purwono
Journal of Advanced Health Informatics Research, Vol. 2 No. 2 (2024)
Publisher: Peneliti Teknologi Teknik Indonesia

DOI: 10.59247/jahir.v2i2.294

Abstract

Medical image segmentation plays a crucial role in computer-aided diagnosis by enabling precise identification of anatomical and pathological structures. While deep learning models have significantly improved segmentation accuracy, their high computational complexity limits deployment in resource-constrained environments such as mobile healthcare and edge computing. Knowledge Distillation (KD) has emerged as an effective model compression technique that allows a lightweight student model to inherit knowledge from a complex teacher model while maintaining high segmentation performance. This review systematically examines key KD techniques, including Response-Based, Feature-Based, and Relation-Based Distillation, and analyzes their advantages and limitations. Major challenges in KD, such as boundary preservation, domain generalization, and computational trade-offs, are explored in the context of lightweight model development. Additionally, emerging trends, including the integration of KD with Transformers, Federated Learning, and Self-Supervised Learning, are discussed to highlight future directions in efficient medical image segmentation. By providing a comprehensive analysis of KD for lightweight segmentation models, this review aims to guide the development of deep learning solutions that balance accuracy, efficiency, and real-world applicability in medical imaging.
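
To make the Response-Based Distillation idea mentioned in the abstract concrete, the following is a minimal PyTorch-style sketch of a combined distillation loss for segmentation logits; the temperature T, weighting alpha, and function name are illustrative assumptions and are not taken from the article itself.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # student_logits, teacher_logits: (N, C, H, W) segmentation logits (assumed shapes)
    # labels: (N, H, W) ground-truth class indices
    # Soft-target term: KL divergence between temperature-softened teacher and student outputs
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: standard cross-entropy against the ground-truth segmentation mask
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

In this sketch the student is trained on a weighted sum of the teacher's softened predictions and the true labels, which is the basic response-based scheme the review surveys; feature-based and relation-based variants would instead match intermediate feature maps or pairwise feature relations.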