This study examined how effectively chatbot-enhanced training helps computer science students learn technical vocabulary. A mixed-methods design combined a pre-test/post-test measure with semi-structured interviews. Quantitative results showed a statistically significant improvement in vocabulary scores, with a mean gain of 12.7% and a large effect size (Cohen's d = 1.59), indicating substantial practical significance. Thematic analysis of the qualitative responses identified three main themes: perceived benefits, motivation and engagement, and usability limitations. Students praised the chatbot's real-time, contextual feedback and reported that examples interwoven with coding scenarios promoted deeper learning. The chatbot's conversational tone, individualized interaction, and emotional engagement were credited with increasing motivation. However, some students pointed out issues such as repetitive outputs, overuse of synonyms, and overly complex examples, underscoring the need for adaptive content calibration. These results suggest that, with careful integration, AI-powered chatbots can serve as effective, personalized vocabulary tutors; wider adoption, however, will require improvements to content delivery.
Copyright © 2025