This paper presents an algorithm that imitates the movements of a human arm and replicates the strokes traced by the user's hand within a working environment. The algorithm identifies the position of the user's left or right arm, tracking each section (fingers, wrist, elbow, and shoulder) through a detection and tracking system. The tracked movements are then mapped onto a virtual arm that simulates a cutting tool, generating strokes as it moves. Convolutional neural networks (CNNs) detect and classify each arm section, while geometric analysis computes the rotation angle of each joint to drive the virtual robot's motion. The stroke replication program achieved 84.2% accuracy in stroke execution, measured by polygon closure, the distance between the initial and final drawing points, and the generated noise, which remained under 10%, with a 99% probability of drawing a closed polygon. A Fast region-based convolutional neural network (Fast R-CNN) detected each arm section with 60.2% accuracy, producing detection boxes with precision ranging from 17% to 59%. Recognition failures were compensated by mathematical estimation of the missing points and by noise filters, yielding a 90.4% imitation rate for human upper-limb movement.
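The abstract does not give the paper's exact formulation of the geometric analysis; a minimal sketch of how a joint rotation angle can be derived from three detected keypoints (assuming 2D pixel coordinates and a hypothetical `joint_angle` helper, both of which are illustrative rather than the authors' method) might look like:

```python
import math

def joint_angle(a, b, c):
    """Interior angle at joint b (degrees), formed by points a-b-c.

    Each point is an (x, y) pixel coordinate, e.g. the centers of the
    shoulder, elbow, and wrist detection boxes.
    """
    # Vectors from the joint toward its two neighbouring sections.
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to avoid domain errors from floating-point rounding.
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))

# Example: shoulder, elbow, and wrist roughly at a right angle.
print(round(joint_angle((0, 0), (1, 0), (1, 1))))  # 90
```

Angles computed this way for each joint can then be applied to the corresponding joints of the virtual arm; when a detection box is missing, its center would first be estimated mathematically, as the paper describes.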
Copyright © 2025