This study benchmarks the effectiveness of three oversampling techniques, namely SMOTE, Random Oversampling (ROS), and ADASYN, in enhancing machine learning performance for multiclass hypertension classification. Using key physiological features and four optimized algorithms (Logistic Regression, Support Vector Machine, Linear Discriminant Analysis, and Artificial Neural Networks), model performance was assessed with accuracy, F1-macro, and ROC AUC metrics. The experimental results indicate that combining SMOTE with Linear Discriminant Analysis (LDA) yields the highest overall performance, achieving an accuracy of 0.9773 and an F1-macro score of 0.9848. Logistic Regression performs best when paired with ROS, also reaching an accuracy of 0.9773. Artificial Neural Networks show the largest performance gain under ADASYN, reflected particularly in higher F1-macro values. Although the Support Vector Machine is less sensitive to oversampling interventions, it achieves a strong ROC AUC of 0.9776 when trained with SMOTE. Overall, the findings confirm that oversampling techniques significantly improve classification performance in multiclass hypertension prediction, with SMOTE combined with LDA emerging as the most effective configuration.