Articles

Found 2 Documents
Journal: Engineering, Mathematics and Computer Science Journal (EMACS)

Leveraging Support Vector Machines and Ensemble Learning for Early Diabetes Risk Assessment: A Comparative Study
Shiddiqi, Hafizh Ash; Setiawan, Karli Eka; Fredyan, Renaldy
Engineering, Mathematics and Computer Science Journal (EMACS) Vol. 7 No. 1 (2025): EMACS
Publisher : Bina Nusantara University

DOI: 10.21512/emacsjournal.v7i1.12846

Abstract

Diabetes, driven in part by everyday food and drink habits, often goes undetected and has become a formidable global health challenge. This study proposes a machine learning approach for identifying people with diabetes from selected health parameters, comparing several Support Vector Machine (SVM)-based models: single SVMs with linear, polynomial, radial basis function, and sigmoid kernels; ensemble bagging with an SVM base learner; and ensemble stacking over the various SVM models. The findings demonstrated that a single SVM with a linear kernel, ensemble bagging with a linear SVM, and ensemble stacking over the SVM models yielded the most accurate results, achieving 95% accuracy in detecting both the presence and absence of diabetes. This supports the idea that incorporating a linear kernel can improve the accuracy of determining whether diabetes is present.
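The three model families the abstract compares can be sketched with scikit-learn. This is a minimal illustration on synthetic binary data (the study's actual diabetes dataset, features, and hyperparameters are not specified in the abstract, so everything below is an assumption for demonstration only):

```python
# Hedged sketch: single SVMs with four kernels, bagging over a linear SVM,
# and stacking of the kernel variants -- on synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.ensemble import BaggingClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Single SVMs with the four kernels compared in the study
kernels = ["linear", "poly", "rbf", "sigmoid"]
single = {k: SVC(kernel=k).fit(X_tr, y_tr).score(X_te, y_te) for k in kernels}

# Ensemble bagging with a linear-kernel SVM as the base learner
bag = BaggingClassifier(SVC(kernel="linear"), n_estimators=10,
                        random_state=0).fit(X_tr, y_tr)

# Ensemble stacking of the different SVM kernel variants
stack = StackingClassifier(
    estimators=[(k, SVC(kernel=k)) for k in kernels],
    final_estimator=LogisticRegression()).fit(X_tr, y_tr)

print(single, bag.score(X_te, y_te), stack.score(X_te, y_te))
```

On real clinical data, the relative ranking of kernels would depend on how linearly separable the health parameters are, which is presumably why the linear kernel dominated in the study's experiments.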
Antiviral Medication Prediction Using A Deep Learning Model of Drug-Target Interaction for The Coronavirus SARS-COV
Fredyan, Renaldy
Engineering, Mathematics and Computer Science Journal (EMACS) Vol. 6 No. 2 (2024): EMACS
Publisher : Bina Nusantara University

DOI: 10.21512/emacsjournal.v6i2.11290

Abstract

Graph convolutional neural networks (GCNs) have shown promising performance in modeling graph data, particularly for small-scale molecules. Message-passing neural networks (MPNNs), an important GCN variant, excel at gathering and integrating molecular information through repeated rounds of message passing, a capability that has driven major advances in molecular modeling and property prediction. Combining the self-attention mechanism with MPNNs offers the potential to improve molecular representations while drawing on the proven efficacy of Transformers in other areas of artificial intelligence. This research introduces a transformer-based message-passing neural network (T-MPNN) designed to improve the embedding of molecular representations for property prediction. Our technique incorporates attention mechanisms into the message-passing and readout phases of MPNNs, yielding seamlessly integrated molecular representations. Experimental results on three datasets show that T-MPNN outperforms or matches state-of-the-art baseline models on quantitative structure-property relationship tasks. Through case studies of SARS-COV growth inhibitors, we demonstrate the model's ability to visualize attention at the atomic level, allowing us to pinpoint individual atoms or functional groups linked with desirable biological properties. The proposed model improves the interpretability of classic MPNNs and is a useful tool for investigating the impact of self-attention on chemical substructures and functional groups in molecular representation learning, leading to a better understanding of drug modes of action.
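The core idea of attention-weighted message passing can be sketched in a few lines of NumPy. This toy example is not the paper's T-MPNN (its architecture, feature dimensions, and readout are not specified in the abstract); it only illustrates one message-passing step where each atom attends over its bonded neighbours, followed by a simple mean readout:

```python
import numpy as np

# Toy sketch of one attention-weighted message-passing step on a small
# "molecular" graph: nodes stand in for atoms, edges for bonds.
rng = np.random.default_rng(0)
n_atoms, dim = 5, 8
H = rng.normal(size=(n_atoms, dim))           # atom feature vectors
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]])               # 5-atom ring adjacency

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Scaled dot-product attention scores, restricted to bonded neighbours
scores = H @ H.T / np.sqrt(dim)
scores = np.where(A == 1, scores, -np.inf)
alpha = softmax(scores)                       # attention over neighbours

H_new = alpha @ H                             # attention-weighted messages
graph_vec = H_new.mean(axis=0)                # naive mean readout
print(graph_vec.shape)                        # (8,)
```

In the attention matrix `alpha`, large entries mark neighbour atoms that dominate a node's update; it is this kind of per-atom weight that the paper visualizes to link substructures with biological activity.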