The rapid development of artificial intelligence (AI) has opened new possibilities for enhancing user interaction in video games. This study presents the design and implementation of a button-based assistant system for the simulation game Story of Seasons: Friends of Mineral Town, aimed at simplifying repetitive player tasks and improving the overall gameplay experience. The proposed system combines machine learning and deep learning techniques: Optical Character Recognition (OCR) with Tesseract, object detection with a custom-trained YOLOv7 model, the A* pathfinding algorithm for navigation, and automated input control through scripting. The assistant reads in-game time, weather, and events directly from screen captures, recognizes non-player characters (NPCs), and automatically directs the player’s character to desired locations or NPCs based on contextual data such as day, time, and weather conditions. A database-driven module stores key information, including NPC schedules, favorite gifts, and daily events, to support informed decision-making and interaction automation. Comprehensive testing was conducted, including comparisons of pathfinding algorithms, model accuracy assessments, and user experience evaluations with volunteers. Results showed high detection accuracy with YOLOv7 and positive user feedback on the assistant's interface and usability; users reported a more streamlined and enjoyable gaming experience, particularly in managing daily tasks and character interactions. This research demonstrates how a hybrid AI-based approach can be effectively applied to traditional video games, offering a foundation for future development of intelligent game assistance systems. The proposed methodology not only improves convenience but also provides insight into the practical integration of AI in user-centric game design.
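As a minimal illustration of the navigation component mentioned above, the Python sketch below implements A* search on a small 2D tile grid. The grid layout, the `find_path` helper, and the example map are hypothetical and are not taken from the paper's implementation; in the actual assistant, walkable tiles would come from the game's map data and the goal tile from NPC schedules or on-screen detections.

```python
import heapq

def find_path(grid, start, goal):
    """A* search on a 2D grid (0 = walkable, 1 = blocked).

    Hypothetical sketch: the real assistant would build `grid` from the
    game map and choose `goal` from NPC schedule or detection data.
    """
    def h(a, b):  # Manhattan-distance heuristic, suitable for 4-way movement
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start, goal), 0, start)]   # entries are (f, g, node)
    came_from = {}
    g_cost = {start: 0}

    while open_heap:
        _, g, node = heapq.heappop(open_heap)
        if node == goal:
            # Reconstruct the path by following parent links back to the start.
            path = [node]
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return path[::-1]
        if g > g_cost.get(node, float("inf")):
            continue  # stale heap entry; a cheaper route was already found
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    came_from[(nr, nc)] = node
                    heapq.heappush(open_heap, (ng + h((nr, nc), goal), ng, (nr, nc)))
    return None  # no route exists between start and goal

# Example: route around a wall on a 4x4 map (purely illustrative layout).
example_map = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
print(find_path(example_map, (0, 0), (3, 3)))
```

The Manhattan-distance heuristic is admissible for 4-directional movement on a uniform-cost grid, which keeps A* both optimal and efficient on tile-based maps of this kind.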