Articles

Found 2 Documents
Enhancing IOT Security: A review of Machine Learning-Driven Approaches to Cyber Threat Detection
Ali, Misbah; Aamir Raza; Malik Arslan Akram; Haroon Arif; Aamir Ali
Journal of Informatics and Interactive Technology Vol. 2 No. 1 (2025): April
Publisher : ACSIT

DOI: 10.63547/jiite.v2i1.64

Abstract

The Internet of Things (IoT) is being rapidly adopted across various industries. This fast growth poses a risk, as IoT devices are ideal targets for breach and exploitation, and, given the heterogeneous nature and resource limitations of IoT networks, traditional security mechanisms often fail to provide the required protection. This study investigates recent IoT security breaches and showcases the vulnerabilities exploited by attackers, as well as their impact on consumer, industrial, and healthcare IoT systems. Proposed solutions based on machine learning (ML) and deep learning (DL) are summarized for adaptive threat detection, anomaly-based intrusion prevention, and intelligent risk mitigation, and different ML- and DL-based approaches to identifying and preventing cyber-attacks are analyzed as effective solutions. The reviewed ML- and DL-based research papers were drawn from the IEEE repository, with publications spanning 2020 to 2024, ensuring current literature on IoT security. The results highlight that security models based on ML and DL techniques improve the resilience of IoT systems against attacks by allowing real-time detection, reducing the volume of false positives, and adapting to new threats. Furthermore, this work identifies the existing barriers to the adoption of ML/DL technologies for IoT security and emphasizes potential areas for future research that may solidify the overall security framework for IoT ecosystems.
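To make the anomaly-based intrusion detection idea concrete, the following minimal Python sketch (not taken from the paper; the traffic features, data, and the choice of scikit-learn's IsolationForest are illustrative assumptions) trains an unsupervised detector on benign IoT traffic only and flags deviating flows:

# Illustrative sketch of anomaly-based intrusion detection for IoT traffic.
# Feature layout and data are hypothetical; a real system would extract
# features from network flows (packet sizes, inter-arrival times, etc.).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Hypothetical benign traffic: moderate packet sizes, regular timing.
benign = rng.normal(loc=[500, 0.1], scale=[100, 0.02], size=(1000, 2))
# Hypothetical attack traffic: large bursts, erratic timing.
attack = rng.normal(loc=[1500, 0.5], scale=[300, 0.2], size=(50, 2))

# Fit the detector on (assumed clean) benign traffic only.
detector = IsolationForest(contamination=0.05, random_state=42)
detector.fit(benign)

# Score unseen traffic: -1 flags an anomaly, 1 is considered normal.
mixed = np.vstack([benign[:100], attack])
labels = detector.predict(mixed)
print(f"Flagged {np.sum(labels == -1)} of {len(mixed)} flows as anomalous")

Because the detector models only normal behavior, it can flag previously unseen attack patterns, which matches the abstract's emphasis on adapting to new threats.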
Random Search-Based Parameter Optimization on Binary Classifiers for Software Defect Prediction
Ali, Misbah; Azam, Muhammad Sohaib; Shahzad, Tariq
Jurnal Ilmiah Teknik Elektro Komputer dan Informatika Vol. 10 No. 2 (2024): June
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/jiteki.v10i2.28973

Abstract

Machine learning classifiers expose a set of parameters, and the efficiency of these classifiers in the context of software defect prediction is greatly impacted by the parameter values chosen when the classifiers are executed. These parameters can be optimized to achieve more accurate results. In this research, the efficiency of binary classifiers for software defect prediction is analyzed through parameter optimization using the random search technique. Three heterogeneous binary classifiers, i.e., decision tree, support vector machine, and Naïve Bayes, are selected to examine the results of parameter optimization. The experiments were performed on seven publicly available NASA datasets, each split into 70:30 train-test proportions with the class distribution preserved. To evaluate performance, five statistical measures were computed: precision, recall, F-measure, area under the curve (AUC), and accuracy. The findings reveal a significant improvement in accuracy for each classifier: on average, the decision tree improved from 88.1% to 95.4%, the support vector machine from 94.3% to 99.9%, and Naïve Bayes from 74.9% to 85.3%. This research contributes to the field of machine learning by presenting a comparative analysis of the accuracy achieved with default parameters versus parameters optimized through random search. The results show that the performance of binary classifiers for software defect prediction can be enhanced to a great extent by employing parameter optimization using random search.
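As a rough illustration of this methodology (not the authors' code; the dataset below is synthetic and the search spaces are assumed, since the paper's exact ranges are not listed here), the Python sketch tunes the three classifiers with scikit-learn's RandomizedSearchCV on a stratified 70-30 split and reports test accuracy and AUC:

# Minimal sketch of random-search parameter optimization for binary
# defect classifiers. Dataset and parameter ranges are illustrative;
# the study used seven NASA defect datasets.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, roc_auc_score

# Synthetic stand-in for a defect dataset (features -> defective yes/no).
X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.8, 0.2], random_state=42)

# 70:30 split with class distribution preserved, as in the abstract.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

# Assumed search spaces for each classifier.
searches = {
    "Decision tree": (DecisionTreeClassifier(random_state=42),
                      {"max_depth": list(range(2, 20)),
                       "min_samples_leaf": list(range(1, 20))}),
    "SVM": (SVC(probability=True, random_state=42),
            {"C": np.logspace(-2, 2, 50),
             "gamma": np.logspace(-3, 1, 50)}),
    "Naive Bayes": (GaussianNB(),
                    {"var_smoothing": np.logspace(-12, -3, 50)}),
}

for name, (model, params) in searches.items():
    # Sample 25 random configurations per classifier, scored by accuracy.
    search = RandomizedSearchCV(model, params, n_iter=25, cv=5,
                                scoring="accuracy", random_state=42)
    search.fit(X_train, y_train)
    best = search.best_estimator_
    acc = accuracy_score(y_test, best.predict(X_test))
    auc = roc_auc_score(y_test, best.predict_proba(X_test)[:, 1])
    print(f"{name}: accuracy={acc:.3f}, AUC={auc:.3f}")

Random search samples a fixed number of configurations from each space, which is typically far cheaper than an exhaustive grid search while finding comparably good parameter settings.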