Hyperparameter optimization plays a crucial role in improving the performance of machine learning models, particularly in sleep disorder classification. However, searching for optimal hyperparameters often requires extensive computational resources and prolonged execution time. To address this issue, this study employs Optuna, a hyperparameter optimization framework based on the Tree-structured Parzen Estimator (TPE) and pruning mechanisms, to adaptively improve the efficiency of the model configuration search. This study compares the performance of Support Vector Machine (SVM), Extreme Gradient Boosting (XGBoost), Random Forest, and Multi-Layer Perceptron (MLP) in classifying sleep disorders based on health and lifestyle variables. The data undergoes several preprocessing steps, including handling of missing values, encoding, normalization (StandardScaler), and class balancing using SMOTE. The models are then developed and optimized using Optuna to determine the best hyperparameter configurations. Evaluation is conducted using Accuracy, Precision, Recall, and F1-score. Experimental results show that before optimization, the Random Forest model achieved an accuracy of 94%, XGBoost 96%, SVM 93%, and MLP 96%. After optimization with Optuna, accuracy increased to 97% for Random Forest, 97% for XGBoost, 98% for SVM, and 97% for MLP. This improvement indicates that Optuna effectively enhances model performance, especially for SVM, which showed the largest accuracy gain after optimization. Thus, the use of Optuna not only accelerates hyperparameter tuning but also improves the accuracy of machine learning models in sleep disorder classification. This approach has great potential to support AI-based medical diagnosis systems, enabling faster and more accurate detection of sleep disorders.
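
As a minimal illustration of the tuning setup described above, the sketch below shows how an Optuna study combining a TPE sampler with a median pruner could be configured for the SVM model. It is a sketch only: the placeholder data, search ranges, fold count, and trial budget are assumptions for demonstration, not the study's exact configuration.

```python
# Minimal sketch: Optuna TPE sampler + median pruner for SVM tuning.
# Placeholder data and hypothetical search ranges; not the study's exact setup.
import optuna
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import StratifiedKFold
from sklearn.svm import SVC

# Synthetic stand-in for the preprocessed (scaled, SMOTE-balanced) features.
X, y = make_classification(
    n_samples=500, n_features=10, n_informative=6, n_classes=3, random_state=42
)


def objective(trial):
    # Hypothetical search space for the SVM hyperparameters.
    params = {
        "C": trial.suggest_float("C", 1e-2, 1e2, log=True),
        "gamma": trial.suggest_float("gamma", 1e-4, 1e0, log=True),
        "kernel": trial.suggest_categorical("kernel", ["rbf", "poly"]),
    }
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
    scores = []
    for step, (train_idx, valid_idx) in enumerate(cv.split(X, y)):
        model = SVC(**params)
        model.fit(X[train_idx], y[train_idx])
        scores.append(accuracy_score(y[valid_idx], model.predict(X[valid_idx])))
        # Report the running mean accuracy so the pruner can stop weak trials early.
        trial.report(sum(scores) / len(scores), step)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return sum(scores) / len(scores)


study = optuna.create_study(
    direction="maximize",
    sampler=optuna.samplers.TPESampler(seed=42),
    pruner=optuna.pruners.MedianPruner(n_warmup_steps=2),
)
study.optimize(objective, n_trials=100)
print(study.best_params, study.best_value)
```

The same pattern applies to the other models by swapping the estimator and search space; pruning is what keeps the search time manageable, since trials whose intermediate cross-validation accuracy falls below the median of previous trials are abandoned early.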