This research aims to develop a gait pattern segmentation model that distinguishes between normal individuals and individuals with disabilities. Walking is the act of moving from one place to another; individuals with physical limitations in the legs exhibit walking patterns that differ from those of individuals without such limitations. This study classifies gait into three categories: individuals with disabilities who use assistive devices (crutches), individuals with disabilities who do not use assistive devices, and normal individuals. The study involved 10 subjects: 2 individuals who use assistive devices, 3 individuals with disabilities who do not use assistive devices, and 5 normal individuals. The research was conducted in three main stages: image database creation, data annotation, and model training and segmentation using YOLOv8. YOLOv8-seg was the platform used to segment the data. The test results showed that the YOLOv8L-seg model converged at the 23rd epoch under the fourth scenario when recognizing the walking patterns of the three categories. However, research on the walking patterns of people with disabilities faces several obstacles, such as subjects' lack of confidence or emotional discomfort during data collection, even though collection was conducted at a location of the subject's choosing. In addition, YOLOv8-seg showed consistent performance across the five models used, achieving a maximum mAP50 of 0.995 for both boxes and masks.
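As a rough illustration of the training setup described above, an Ultralytics YOLOv8-seg dataset is typically declared in a small YAML file listing the image directories and class names. The paths and class labels below are hypothetical, chosen only to mirror the three gait categories in this study:

```yaml
# Hypothetical data.yaml for YOLOv8-seg training.
# Paths and class names are illustrative assumptions, not from the study.
path: gait_dataset        # dataset root (assumed directory layout)
train: images/train       # training images (annotated with polygon masks)
val: images/val           # validation images

# The three gait categories segmented in this study
names:
  0: crutches             # disabled individual walking with crutches
  1: disabled_no_device   # disabled individual without an assistive device
  2: normal               # normal gait
```

A segmentation model such as YOLOv8L-seg would then be trained against this file through the Ultralytics Python API, e.g. `YOLO("yolov8l-seg.pt").train(data="data.yaml", epochs=...)`; this is a sketch of the standard workflow, not the authors' exact configuration.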