Stress is a pervasive condition that affects mental health, productivity, and quality of life across populations. Traditional methods of stress assessment, such as the Perceived Stress Scale (PSS), rely on retrospective self-reporting and are limited by subjectivity and delayed feedback. To address these limitations, this study developed an integrated real-time stress monitoring system combining Galvanic Skin Response (GSR) sensors, Internet of Things (IoT) technology, and machine learning. Primary GSR data were collected from 30 participants under varied conditions and supplemented with secondary data from the WESAD dataset. A Random Forest classifier categorized stress into four levels: normal, mild, moderate, and severe. To counter class imbalance, the Synthetic Minority Over-sampling Technique (SMOTE) was applied, improving model robustness. The system achieved a cross-validated classification accuracy of 69%, with substantial gains in detecting moderate and severe stress relative to traditional threshold-based methods. Strong agreement (Cohen's kappa, κ = 0.82) was observed between system predictions and PSS-based stress assessments. Feature-importance analysis identified mean GSR value and Skin Conductance Response (SCR) amplitude as the most influential indicators of stress. In usability evaluation, the system received high user ratings for accessibility, simplicity, and interactivity, and a simple Python-based command-line interface (CLI) was developed for real-time stress prediction from input features. This research demonstrates the feasibility and effectiveness of combining physiological sensing, predictive analytics, and user-friendly interfaces to enable scalable and adaptive stress monitoring. Future developments will focus on integrating additional physiological modalities and deep learning techniques to enhance predictive performance and personalization in clinical and everyday contexts.
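The pipeline described above (SMOTE-style oversampling of rare stress classes, a Random Forest classifier, cross-validated accuracy, and feature importances over GSR-derived features) can be sketched as follows. This is a minimal illustration on synthetic placeholder data, not the study's actual dataset or code: the two features, the class counts, and the `smote_oversample` helper (a hand-rolled nearest-neighbour interpolation standing in for the SMOTE implementation the authors used) are all assumptions for demonstration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

def smote_oversample(X, y, target_class, n_new, k=5):
    """Minimal SMOTE-style oversampling: synthesize new points on the
    segment between a minority sample and one of its k nearest
    minority-class neighbours."""
    Xm = X[y == target_class]
    nn = NearestNeighbors(n_neighbors=k + 1).fit(Xm)
    _, idx = nn.kneighbors(Xm)
    new = []
    for _ in range(n_new):
        i = rng.integers(len(Xm))
        j = idx[i][rng.integers(1, k + 1)]  # column 0 is the point itself
        lam = rng.random()                  # interpolation factor in [0, 1)
        new.append(Xm[i] + lam * (Xm[j] - Xm[i]))
    return (np.vstack([X, new]),
            np.concatenate([y, np.full(n_new, target_class)]))

# Placeholder feature matrix: [mean GSR, SCR amplitude] per window,
# with imbalanced 4-class labels (0=normal, 1=mild, 2=moderate, 3=severe).
counts = [150, 90, 40, 20]
y = np.repeat(np.arange(4), counts)
X = rng.normal(size=(sum(counts), 2)) + y[:, None] * 0.8

# Oversample the two rarest classes before training.
for cls, extra in [(2, 110), (3, 130)]:
    X, y = smote_oversample(X, y, cls, extra)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # stratified 5-fold CV
clf.fit(X, y)
print("CV accuracy: %.2f" % scores.mean())
print("Feature importances:", clf.feature_importances_)
```

Oversampling only the minority classes (moderate, severe) before fitting mirrors the paper's use of SMOTE to improve detection of exactly those under-represented stress levels.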
Copyright © 2025