The support vector machine (SVM) is a pivotal classification algorithm, and its variant, the twin SVM (TWSVM), is widely recognized for its strong generalization performance, particularly on imbalanced data. TWSVMs train quickly by constructing a pair of non-parallel hyperplanes, yet choosing numerical values for their hyperparameters remains challenging because arbitrary or random choices introduce uncertainty. This paper presents a novel approach, the Chebyshev distance-based TWSVM, designed specifically for hyperparameter tuning in imbalanced binary classification. The proposed model mitigates the uncertainty of hyperparameter selection by leveraging the Chebyshev distance, thereby enhancing the generalization ability of the TWSVM. To evaluate its efficacy, computational experiments were conducted on publicly available real-world benchmark datasets from various domains, including non-linear cases. The results show that the Chebyshev distance-based TWSVM outperforms several existing methods, achieving superior performance in less computational time.
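For readers unfamiliar with the distance measure named in the title, the following is a minimal illustrative sketch of the Chebyshev (L-infinity) distance and of one conceivable way a distance-to-reference criterion could rank candidate hyperparameter settings. The reference point, the candidate list, and the selection rule shown here are assumptions made purely for illustration; the abstract does not specify how the proposed model integrates the Chebyshev distance into TWSVM training, and the actual procedure is described in the paper itself.

```python
import numpy as np

def chebyshev_distance(x, y):
    """Chebyshev (L-infinity) distance: the maximum absolute
    coordinate-wise difference between two vectors."""
    return np.max(np.abs(np.asarray(x, dtype=float) - np.asarray(y, dtype=float)))

# Hypothetical illustration (not the paper's method): rank candidate
# hyperparameter settings, e.g. TWSVM penalty parameters (c1, c2),
# by their Chebyshev distance to an assumed reference setting and
# keep the closest candidate.
reference = np.array([1.0, 1.0])            # assumed reference point
candidates = [np.array([0.5, 2.0]),
              np.array([1.5, 1.2]),
              np.array([4.0, 0.1])]

best = min(candidates, key=lambda c: chebyshev_distance(c, reference))
print("closest candidate:", best)           # -> [1.5 1.2]
```

The Chebyshev distance depends only on the single coordinate with the largest deviation, which is what makes it a natural worst-case measure when comparing hyperparameter configurations component by component.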