Braille digitization plays an important role in improving access to written information for visually impaired individuals. However, automatic recognition of Braille text remains challenging due to the small size, dense spatial arrangement, and subtle variations of Braille dot patterns. This study proposes a syllable-based Braille cell detection and recognition framework built on a customized Faster R-CNN architecture. The system employs a Region Proposal Network (RPN) to localize individual Braille cells, followed by an AlexNet-based classification network that recognizes syllable-level patterns. The proposed method is evaluated on a syllable-level Braille dataset covering 50 syllable classes, and a two-stage training strategy is adopted to improve bounding-box localization accuracy. Experimental results show stable training convergence and consistent classification performance across syllable categories. Confusion-matrix analysis indicates that most misclassifications occur among syllables with visually similar dot configurations. Despite its sensitivity to variations in physical Braille quality, the proposed framework shows potential for accessibility-oriented Braille digitization systems.
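The confusability noted above can be illustrated with a minimal sketch (not part of the paper's method): a standard 6-dot Braille cell is a binary pattern over a 2×3 dot grid, and syllable cells whose patterns differ in only one dot position are the most likely to be confused by a classifier. The example cells below are hypothetical, chosen only to show the distance computation.

```python
def hamming(cell_a, cell_b):
    """Number of dot positions in which two 6-dot Braille cells differ."""
    return sum(a != b for a, b in zip(cell_a, cell_b))

# Hypothetical 6-dot cells encoded as binary tuples (dots 1-6):
# cell_x and cell_y differ in a single dot, so they are visually similar;
# cell_x and cell_z differ in every dot, so they are maximally distinct.
cell_x = (1, 0, 1, 0, 1, 0)
cell_y = (1, 0, 1, 0, 1, 1)
cell_z = (0, 1, 0, 1, 0, 1)

print(hamming(cell_x, cell_y))  # 1
print(hamming(cell_x, cell_z))  # 6
```

Patterns at Hamming distance 1 are the natural candidates for the off-diagonal mass seen in a confusion matrix of syllable classes.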
Copyright © 2026