Found 21 Documents

Structure learning of bayesian network using swarm intelligent algorithm: a review Kareem, Shahab Wahhab; Askar, Shavan; Ahmed, Kosrat Dlshad
Bulletin of Social Informatics Theory and Application Vol. 5 No. 2 (2021)
Publisher : Association for Scientific Computing Electrical and Engineering

DOI: 10.31763/businta.v5i2.463

Abstract

Bayesian networks provide a framework for representing knowledge in artificial intelligence by connecting variables probabilistically. Structure learning searches for the best-scoring network by applying operators such as deleting, reversing, moving, and inserting edges. This review examines swarm-intelligence and search-based approaches to the problem, including the enhanced surface-water search technique (inspired by elephants' hunt for water during dry seasons), Pigeon-Inspired Optimization, Simulated Annealing, and greedy search, all evaluated with the BDeu metric. The approaches were compared on several data sets using a confusion matrix to determine which performed best; according to the evaluation data, the swarm-based algorithm shows stronger results and delivers better scores. The article also outlines the structure-learning process for Bayesian networks.
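The "deleting, reversing, moving, and inserting" operators mentioned above act on candidate network structures during score-based search. As an illustrative sketch (not the reviewed algorithms themselves; `is_acyclic` and `neighbors` are hypothetical names), the pure-Python helpers below generate the neighbor structures reachable from a DAG by inserting, deleting, or reversing one edge while preserving acyclicity; a real learner would score each candidate with a metric such as BDeu and keep the best.

```python
from itertools import permutations

def is_acyclic(nodes, edges):
    """Kahn's algorithm: True if the directed graph contains no cycle."""
    indeg = {n: 0 for n in nodes}
    for _, v in edges:
        indeg[v] += 1
    queue = [n for n in nodes if indeg[n] == 0]
    seen = 0
    while queue:
        u = queue.pop()
        seen += 1
        for a, b in edges:
            if a == u:
                indeg[b] -= 1
                if indeg[b] == 0:
                    queue.append(b)
    return seen == len(nodes)

def neighbors(nodes, edges):
    """Candidate structures reachable by inserting, deleting, or
    reversing a single edge (only acyclic candidates are kept)."""
    out = []
    for u, v in permutations(nodes, 2):
        if (u, v) not in edges and (v, u) not in edges:
            cand = edges | {(u, v)}              # insert an edge
            if is_acyclic(nodes, cand):
                out.append(cand)
    for e in edges:
        out.append(edges - {e})                  # delete (never creates a cycle)
        rev = (edges - {e}) | {(e[1], e[0])}     # reverse
        if is_acyclic(nodes, rev):
            out.append(rev)
    return out
```

For a three-node network with one edge A→B, this yields six legal neighbors (four insertions, one deletion, one reversal), which a hill-climbing or swarm-based searcher would then rank by score.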
Control Traffic in SDN Systems by using Machine Learning techniques: Review Askar, Shavan; Hussein, Diana; Ibrahim, Media; Aziz Mohammed, Marwan
International Journal of Research and Applied Technology (INJURATECH) Vol. 5 No. 1 (2025)
Publisher : Universitas Komputer Indonesia


Abstract

The rapid development of Internet and mobile communication technologies has driven networking systems to become increasingly complex and diverse in infrastructure, devices, and resources. This calls for deploying further intelligence to improve the organization, management, maintenance, and optimization of these networks. However, the inherently distributed structure of traditional networks makes it difficult to apply machine learning techniques to their control and operation. Software-Defined Networking (SDN) facilitates the application of machine learning through centralized control of all network operations, holistic knowledge of the network, software-based traffic monitoring, and dynamic updating of forwarding rules. This study makes an extensive review of the existing literature to answer how machine learning techniques can be used in the context of SDN. It first reviews the foundational literature, then briefly surveys machine learning techniques, and then examines their application in SDN, focusing on traffic classification, Quality-of-Service (QoS) prediction, routing and Quality-of-Experience (QoE) optimization, security, and resource management. Finally, it discusses challenges and broader perspectives.
Deep Learning Security Schemes in IIoT: A Review Askar, Shavan; Hussein, Diana; Ibrahim, Media; Mohammed , Marwan Aziz
International Journal of Research and Applied Technology (INJURATECH) Vol. 5 No. 1 (2025)
Publisher : Universitas Komputer Indonesia


Abstract

The Industrial Internet of Things (IIoT) is a fast-growing technology that could digitize and connect numerous industries, with substantial prospects for economic and global GDP growth. Driven by the fourth industrial revolution, IIoT platforms create massive, dynamic, and heterogeneous data from interconnected devices and sensors, and such large, diverse data complicates security and data analysis. As the IIoT grows, cyberattacks become more diversified and sophisticated, making anomaly detection algorithms less effective. The IIoT is used in manufacturing, logistics, transportation, oil and gas, mining, metallurgy, energy utilities, and aviation. It offers significant potential for industrial application development, although it also invites cyberattacks and imposes higher security requirements. The enormous volume of data produced by IoT devices demands advanced data analysis and processing technologies such as deep learning (DL). With DL algorithms, the IIoT enables smart assembly, smart manufacturing, efficient networking, and accident detection and prevention. These many applications motivated this article on DL's potential in the IIoT.
Comprehensive Review of Advanced Machine Learning Strategies for Resource Allocation in Fog Computing Systems Abdulwahab, Sara; Ibrahim, Media; Askar, Shavan; Hussien, Diana
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4632

Abstract

This paper surveys advanced machine learning strategies for fog computing systems, aiming to enhance current resource-allocation mechanisms. Fog computing extends cloud facilities to the network edge, bringing data processing closer to its source and enabling the minimal latency required by real-time applications. The review highlights deep learning as a basic tool: neural networks predict resource usage and optimize resource allocation, adapting dynamically to changing network conditions. Techniques such as Convolutional Neural Networks, Recurrent Neural Networks, and Generative Adversarial Networks are explored for their roles in enhancing efficiency, privacy, and responsiveness in distributed environments. The findings reveal that deep learning significantly enhances operational performance, reduces latency, and strengthens security in fog networks. By processing data locally and managing resources autonomously, these strategies ensure efficient handling of diverse and dynamic demands. The paper concludes that integrating machine learning into fog computing forms a scalable and robust framework for meeting the challenges imposed by modern digital ecosystems, enabling smarter real-time decision-making at the edge.
The Role of Deep Learning in Network Intrusion Detection Systems: A Review Abdullah, Rebwar; Ibrahim, Media; Askar, Shavan; Hussein, Diana
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4734

Abstract

This review synthesizes findings from several key studies on the role of deep learning (DL) in network intrusion detection systems (NIDS). It highlights the growing importance of DL techniques for detecting complex and evolving cyber threats. Traditional methods such as signature-based or anomaly-based systems often fail to detect modern attacks accurately, prompting researchers to explore DL to improve accuracy and adaptability. Several studies have demonstrated the effectiveness of convolutional neural networks (CNNs), recurrent neural networks (RNNs), and deep belief networks (DBNs) in classifying network traffic and identifying malicious activities. These deep learning models are particularly valuable because they automatically learn features from raw data, reducing the need for manual feature engineering. The review emphasizes the challenges in training DL models, including the need for large labelled datasets and issues with false positives and model interpretability. Despite these challenges, DL-based NIDS have shown significant improvements in real-time threat detection and mitigation rates, and research is ongoing to optimize these models for better performance, scalability, and generalizability across different network environments. Overall, the integration of deep learning into NIDS represents a promising frontier in combating increasingly sophisticated cyberattacks.
Machine Learning Techniques for Enhancing Internet of Things (IoT) Performance A Review Hussein, Diana; Abdullah, Rebwar; Askar, Shavan; Ibrahim, Media
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4735

Abstract

The Internet of Things (IoT) is basically billions of interconnected smart devices that can communicate with little interference from humans, thus making life easier. The IoT is a fast-moving area of research, and the challenges are growing, thus requiring continuous improvement. As IoT systems become more challenging to improve, machine learning (ML) is increasingly incorporated into IoT systems to develop better capabilities. This article review explores several machine learning techniques aimed at enhancing the performance of IoT systems. It highlights the growing importance of integrating machine learning with IoT to address challenges such as data management, security, and real-time processing. The techniques discussed include supervised learning, unsupervised learning, reinforcement learning, deep learning, ensemble methods, anomaly detection, and federated learning. Each method is evaluated for its effectiveness in optimizing IoT applications, such as predictive maintenance, energy efficiency, and smart city solutions. The review emphasizes the potential of these techniques to improve decision-making processes, automate operations, and enhance user experiences. Additionally, it addresses the limitations and challenges associated with implementing machine learning in IoT environments, including data privacy concerns and the need for robust algorithms capable of handling diverse datasets. Overall, the article underscores the transformative role of machine learning in advancing IoT capabilities and suggests future research directions to further leverage these technologies for improved system performance and reliability.
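Among the techniques listed above, federated learning has a particularly compact core step that addresses the data-privacy concern the abstract raises: clients train locally and share only model parameters. As a hedged sketch (the function name and flat parameter-vector representation are assumptions, not from the reviewed works), a FedAvg-style aggregation averages client parameters weighted by local sample counts:

```python
def fed_avg(client_weights, client_sizes):
    """FedAvg-style aggregation: weighted average of client model parameters.

    client_weights: one flat list of parameter floats per client
    client_sizes:   number of local training samples per client
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Two clients: the second trained on 3x more samples, so it dominates the average.
global_model = fed_avg([[1.0, 2.0], [3.0, 4.0]], [10, 30])  # -> [2.5, 3.5]
```

Raw IoT sensor data never leaves the device; only these aggregated parameters circulate, which is why federated learning is attractive for the privacy-constrained IoT settings the review discusses.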
A comparative evaluation for Detection Brain Tumor in MRI Image using Machine learning algorithms Kareem, Shahab; Askar, Shavan; Abdulkhaleq, Ibrahim; Hawezi, Roojwan Sc.
Jurnal Informatika UPGRIS Vol 7, No 2: Desember 2021
Publisher : Universitas PGRI Semarang

DOI: 10.26877/jiu.v7i2.9503

Abstract

In medical imaging, automated defect identification has taken on a prominent position. Unaided recognition of brain tumors in magnetic resonance imaging (MRI) is vital for patient planning, and automating the traditional manual identification methods is designed to reduce the burden on radiologists. One of the problems with MRI brain tumor diagnosis is the size and variation of tumor structures. This article applies machine learning techniques (Artificial Neural Network (ANN), Naive Bayes (NB), and Multi-Layer Perceptron (MLP)) to detect brain tumors in MRI scans. First, the brain MRI images are run through preprocessing steps to extract texture features; these features are then used to train the machine learning algorithms.
Comparative Analysis of XGBoost Performance for Text Classification with CPU Parallel and Non-Parallel Processing Ahmed Al-Zakhali, Omar; Zeebaree, Subhi; Askar, Shavan
The Indonesian Journal of Computer Science Vol. 13 No. 2 (2024): The Indonesian Journal of Computer Science
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v13i2.3798

Abstract

This paper presents a study of how CPU parallel processing affects text classification with Extreme Gradient Boosting (XGBoost). The main goal is to determine whether XGBoost models can sort news stories into predefined categories faster, and with comparable accuracy, with or without CPU parallelism. A Keras-provided dataset is preprocessed to extract TF-IDF (Term Frequency-Inverse Document Frequency) features, which are then used to train the XGBoost model. Two variants of the XGBoost classifier are evaluated: one with CPU parallelism enabled and one without. Performance is assessed by classification accuracy and by the time taken to train and predict. The two variants differ greatly in computation time but are very close in accuracy: CPU parallel processing makes the tasks proceed more rapidly, and XGBoost exploits that speed well. The study shows that parallel processing can speed up XGBoost models without affecting their accuracy, which is helpful for text categorization.
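The TF-IDF features at the heart of this pipeline can be computed in a few lines. The sketch below is a minimal pure-Python TF-IDF with sklearn-style smoothed IDF (an assumption; the paper does not state its exact weighting). In practice the resulting vectors would be fed to an XGBoost classifier, where the `n_jobs` parameter of `xgboost.XGBClassifier` toggles the CPU parallelism the study compares.

```python
import math
from collections import Counter

def tf_idf(docs):
    """TF-IDF vectors for a list of tokenized documents.

    Uses smoothed IDF: log((1 + N) / (1 + df)) + 1, so terms that appear
    in every document still get a nonzero weight.
    """
    n = len(docs)
    df = Counter()                       # document frequency per term
    for doc in docs:
        df.update(set(doc))
    vocab = sorted(df)
    idf = {t: math.log((1 + n) / (1 + df[t])) + 1 for t in vocab}
    vectors = []
    for doc in docs:
        counts = Counter(doc)
        total = len(doc)
        vectors.append([counts[t] / total * idf[t] for t in vocab])
    return vocab, vectors
```

A term shared by all documents ("a" below) ends up with a lower weight than a term unique to one document, which is exactly the discrimination signal a downstream classifier exploits.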
Deep Learning Based Security Schemes for IoT Applications: A Review Othman, Mina; Askar, Shavan; Ali, Daban; Ibrahim, Media Ali; Abdullah, Nihad
The Indonesian Journal of Computer Science Vol. 13 No. 2 (2024): The Indonesian Journal of Computer Science
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v13i2.3839

Abstract

Widely perceived as a crucial element of the future Internet, the Internet of Things (IoT) has garnered much attention in recent years. The IoT is made up of billions of sentient, communicative "things" that expand the boundaries of the physical and virtual worlds. Such widely used smart gadgets generate enormous amounts of data every day, creating an urgent need for rapid data analysis across a range of smart mobile devices. Thankfully, recent developments in deep learning make it possible to address this issue elegantly: deep models can handle large amounts of sensor data and rapidly and effectively learn the underlying properties needed by a variety of IoT applications on smart devices. This article reviews the research on applying deep learning to IoT applications. Our goal is to provide insight into the many ways deep learning techniques may support IoT applications in four typical domains: smart industry, smart home, smart healthcare, and smart transportation. One of the main goals is to seamlessly integrate deep learning and IoT, leading to a variety of novel ideas in IoT applications, including autonomous driving, manufacturing inspection, intelligent control, indoor localization, health monitoring, disease analysis, and home robotics. We also discuss a number of problems, difficulties, and potential avenues for future study involving deep learning (DL), which is proving to be one of the most effective and appropriate methods for dealing with various IoT security concerns; recent research has aimed to enhance deep learning algorithms for better IoT security.
This study examines deep learning-based intrusion detection techniques, evaluates the effectiveness of several deep learning methods, and determines the most effective approach for deploying intrusion detection in the IoT. It uses DL approaches to expand intelligence and application capabilities by exploiting the large quantity of data generated or acquired. The many IoT domains have drawn the attention of numerous researchers, and both DL and IoT approaches have been explored; because DL was designed to handle varied data in huge volumes with near-real-time processing, several studies identify it as a workable method for handling the data generated by the IoT.
Image Copyright Protection Based on Blockchain Technology Review Ali, Daban; Askar, Shavan; Saleem, Mohammed; Othman, Mina; Omer, Saman M.
The Indonesian Journal of Computer Science Vol. 13 No. 2 (2024): The Indonesian Journal of Computer Science
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v13i2.3840

Abstract

On a daily basis, many individuals distribute photos and videos that have been only marginally modified from the original material produced by copyright owners such as photographers, graphic designers, and video producers. Individuals who infringe upon the rights of others, lacking legal authority to access the multimedia content, employ various digital image and video manipulation techniques, such as converting to grayscale, cropping, rotating, shrinking the frame, and adjusting the playback speed, to modify said content. Copyright infringement poses a significant barrier to protecting commercial image and video information, and photographers and designers who publish their photographs on websites experience significant dissatisfaction with the prevalent practice of others claiming credit for, and profiting from, the original creator's effort. Blockchain technology obviates the need for an intermediary, circumventing any single point of failure. Combined with IPFS, blockchain offers on-chain preservation of copyright information and off-chain storage of the distinct multimedia files, while an enhanced perceptual hashing algorithm significantly improves the precision of identifying near-duplicate images and thus detecting digital image piracy.
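The perceptual-hashing idea this abstract relies on can be illustrated in a few lines. The sketch below implements a simple average hash (aHash) rather than the paper's enhanced algorithm, whose details are not given here: each bit records whether a grayscale pixel exceeds the image mean, and a small Hamming distance between two hashes flags a marginally modified copy (unlike cryptographic hashes, which change completely under tiny edits).

```python
def average_hash(pixels):
    """Perceptual hash of a grayscale image (list of rows of 0-255 ints):
    each bit is 1 if the pixel lies above the image mean.
    Real implementations first downscale the image to e.g. 8x8 pixels."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits; a small distance flags a near-duplicate."""
    return sum(a != b for a, b in zip(h1, h2))
```

Because mild brightness or compression changes rarely flip a pixel across the image mean, a slightly altered copy hashes to (nearly) the same bits, so a registry can match suspect uploads against on-chain hashes of the originals.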