Found 21 Documents

Deep Learning Algorithms for Detecting and Mitigating DDoS Attacks Hamad, Soran; Askar, Shavan; Sami Khoshaba, Farah; Maghdid, Sozan; Abdullah, Nihad
The Indonesian Journal of Computer Science Vol. 13 No. 2 (2024): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v13i2.3847

Abstract

The rising threat of Distributed Denial of Service (DDoS) attacks means that advanced, adaptive detection tools are required now more than ever. This research explores the latest solutions for preventing DDoS attacks and emphasizes how Artificial Intelligence (AI) enhances end-to-end detection techniques. Through the analysis of several key approaches, this work notes that AI-guided models quickly identify and counteract unusual traffic patterns that may indicate an oncoming DDoS attack. Machine learning algorithms, sophisticated data analytics, and AI-based detection systems for traffic pattern recognition are essential aspects of creating networks that are more resilient against such attacks. Importantly, AI excels at behavioral analysis because it can distinguish and adapt to changing attack vectors. Additionally, the paper shows how AI enables proactive mitigation strategies, supporting rapid interventions such as temporarily halting traffic, rerouting, and targeted blocklisting through real-time control panel operations. At the same time, the persistent challenges and limitations fundamental to current DDoS detection and prevention techniques are critically addressed. It emerges that these techniques must remain open to continuous innovation and improvement as attacks evolve over time. This paper takes the position that AI-driven detection mechanisms are a natural fit for network security against DDoS attacks. It underlines the importance of integrating AI-based solutions with conventional practices in order to enhance network resilience and efficiently counteract continually evolving cyber threats.
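
As a concrete illustration of the anomaly-based detection this abstract describes, the hedged sketch below trains an unsupervised model on flow-level traffic features and flags outliers that could indicate a volumetric attack. The feature set, rates, and model choice are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of anomaly-based DDoS detection on flow features.
# Assumption: the features (packets/s, bytes/s, unique sources) are
# illustrative; the paper does not prescribe a feature set or model.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic "normal" traffic: moderate rates, few distinct sources.
normal = rng.normal(loc=[500, 4e5, 50], scale=[100, 5e4, 10], size=(1000, 3))

# Synthetic flood: very high packet rate from many spoofed sources.
attack = rng.normal(loc=[50000, 3e7, 5000], scale=[5000, 1e6, 500], size=(20, 3))

model = IsolationForest(contamination=0.02, random_state=0).fit(normal)

# -1 marks an anomaly; a real deployment would trigger rerouting or
# blocklisting (the mitigations the abstract lists) on this signal.
labels = model.predict(np.vstack([normal[:5], attack[:5]]))
print(labels)  # expected: mostly 1 for normal rows, -1 for attack rows
```
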
Cyber Security Challenges in Industry 4.0: A Review Sami Khoshaba, Farah; Askar, Shavan; Hamad, Soran; Maghdid, Sozan
The Indonesian Journal of Computer Science Vol. 13 No. 2 (2024): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v13i2.3848

Abstract

In the era of Industry 4.0, when smart factories and networked systems are reshaping the landscape of industrial production, protecting important data and information security is a top priority. Cyber-physical systems and the technologies that support them are the keys to Industry 4.0, which is founded on four essential design principles: interoperability, availability of information, technological assistance, and decentralized decision-making. These design principles, however, introduce new weaknesses that could be exploited by malicious actors. To protect these systems from emerging dangers, great consideration should be given to proactive and adaptive security measures, which will consequently enable the continuing growth and success of Industry 4.0 technologies. This paper delves into the multifaceted challenges that Industry 4.0 presents in terms of data security and the emerging solutions and strategies required to protect vital information in this brave new world of manufacturing. The exploration of these challenges and the proposed solutions is essential for businesses and policymakers alike to navigate the complexities of data security and ensure the resilience of critical information in the digital age of Industry 4.0.
Signal Propagation and Path-Loss in 6G Mobile Telecommunication System Soran, Zhala; Askar, Shavan; Khosnawi, Dilshad; Saeed, Hasan
The Indonesian Journal of Computer Science Vol. 13 No. 2 (2024): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v13i2.3850

Abstract

Mobile communication has evolved through several generations (2G, 3G, and 4G networks) and has since been developed into the fifth-generation (5G) mobile communication system. Compared with 5G, 6G will be faster and have lower latency; in the 6G era, the theoretical network speed will reach 1 Tbps. That is 100 times the speed of 5G and 10,000 times the speed of the current 4G. 5G was officially commercialized in 2019, but it has not yet covered large areas and suffers from high tariffs and incomplete network coverage; 6G networks make up for these problems. 6G, through the integration of ground base stations and satellite communications, will cover the whole world, achieving truly global coverage. However, 6G networks also face issues of their own. This article explains signal propagation for 6G with a focus on path loss: 6G uses sub-millimeter (terahertz) waves to achieve faster connections, but these waves suffer severe losses in the propagation field. The article also explains how to address the path-loss challenge.
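
To make the path-loss problem this abstract raises concrete, the hedged sketch below evaluates the standard free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) + 20·log10(4π/c), at a 4G-era carrier, a 5G millimeter-wave carrier, and a candidate 6G terahertz carrier. The specific frequencies and the 100 m distance are illustrative assumptions; the paper's own figures are not reproduced here.

```python
# Free-space path loss (FSPL) comparison across carrier frequencies.
# Assumption: the example frequencies (2.4 GHz, 28 GHz, 300 GHz) are
# illustrative stand-ins for 4G, 5G mmWave, and 6G THz carriers.
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

for label, f in [("4G (2.4 GHz)", 2.4e9),
                 ("5G mmWave (28 GHz)", 28e9),
                 ("6G THz (300 GHz)", 300e9)]:
    print(f"{label:20s} FSPL at 100 m: {fspl_db(100, f):6.1f} dB")

# Each 10x increase in frequency adds 20 dB of free-space loss, which is
# why THz links need dense base stations or highly directional antennas.
```
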
Fog Computing in Next Generation Networks: A Review Khosnawi, Dilshad; Askar, Shavan; Soran, Zhala; Saeed, Hasan
The Indonesian Journal of Computer Science Vol. 13 No. 2 (2024): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v13i2.3851

Abstract

Cloud, edge, and fog computing have recently attracted significant attention in both industry and academia. However, clearly defining these computing paradigms and the correlation between them is difficult. In order to support modern computing systems, cloud, edge, and fog computing offer high-quality services, lower latency, multi-tenancy, mobility support, and many other features. Fog/edge computing is an emerging computing paradigm that uses decentralized resources at the edge of a network to process data closer to user devices, like smartphones and tablets, as an alternative to using remote and centralized cloud data center resources. Fog networking, or fogging, is among the most widely adopted models in recent years. By addressing this issue, this work serves as a valuable resource for subsequent research. Initially, we present an overview of modern computing models and the associated research areas of interest. After that, we discuss each paradigm, and then go into great detail about fog computing, highlighting its exceptional function as the link between edge, cloud, and IoT computing. Finally, we briefly outline open research questions and future directions in edge, fog, cloud, and IoT computing.
Comparative Evaluation of VXLAN with Traditional Overlay Network Protocols Saeed, Hasan; Askar, Shavan; Soran, Zhala; Khosnawi, Dilshad
The Indonesian Journal of Computer Science Vol. 13 No. 2 (2024): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v13i2.3852

Abstract

This article examines various network virtualization technologies, including Virtual Extensible LAN (VXLAN), as well as overlay network protocols such as VXLAN-EVPN (Ethernet VPN) and VXLAN-LISP (Locator/Identifier Separation Protocol). These protocols play a crucial role in improving the scalability and flexibility of large cloud computing infrastructures. While each of these technologies can be employed to expand a Layer 2 connection across an already established network, they possess unique qualities and applications. VXLAN-EVPN shows higher performance in terms of encapsulation speed and reduced packet overhead, rendering it highly suitable for high-speed and large-scale deployments. Conversely, VXLAN-LISP demonstrates superior network latency and interoperability, offering benefits in multi-tenant and geographically distributed networks. VXLAN can also be combined with other widely used overlay network protocols, including Generic Network Virtualization Encapsulation (GENEVE), Stateless Transport Tunneling (STT), and Network Virtualization using Generic Routing Encapsulation (NVGRE). The objective is to offer a comprehensive understanding of these technologies and their suitability in diverse network contexts.
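
As a back-of-the-envelope illustration of the packet-overhead comparison this abstract mentions, the hedged sketch below totals the standard encapsulation headers VXLAN adds to an inner Ethernet frame over IPv4 and contrasts it with NVGRE. Header sizes follow the relevant RFCs (VXLAN: RFC 7348; NVGRE: RFC 7637); the payload size and efficiency figures are illustrative, not measurements from the paper.

```python
# Per-packet encapsulation overhead for VXLAN vs. NVGRE over IPv4.
# Header sizes follow RFC 7348 (VXLAN) and RFC 7637 (NVGRE);
# the inner frame size is an illustrative assumption.
OUTER_ETHERNET = 14   # outer MAC header, bytes (no 802.1Q tag)
OUTER_IPV4 = 20       # outer IPv4 header, bytes
UDP = 8               # UDP header, bytes (VXLAN only)
VXLAN_HDR = 8         # VXLAN header with 24-bit VNI, bytes
GRE_HDR = 8           # GRE header with key field (NVGRE), bytes

overheads = {
    "VXLAN": OUTER_ETHERNET + OUTER_IPV4 + UDP + VXLAN_HDR,  # 50 bytes
    "NVGRE": OUTER_ETHERNET + OUTER_IPV4 + GRE_HDR,          # 42 bytes
}

inner_frame = 1518  # illustrative inner Ethernet frame size, bytes
for name, extra in overheads.items():
    efficiency = inner_frame / (inner_frame + extra)
    print(f"{name}: +{extra} bytes/packet, "
          f"wire efficiency {efficiency:.1%} for a {inner_frame}-byte frame")
```
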
Control Traffic in SDN Systems by using Machine Learning techniques: Review Askar, Shavan; Hussein, Diana; Ibrahim, Media; Aziz Mohammed, Marwan
International Journal of Research and Applied Technology (INJURATECH) Vol. 5 No. 1 (2025): Vol 5 No 1 (2025)
Publisher : Universitas Komputer Indonesia

Abstract

The rapid development of Internet and mobile communication technologies has spearheaded fast growth in networking systems, which have become increasingly complex and diverse in their infrastructure, devices, and resources. This requires further deployment of intelligence to improve the organization, management, maintenance, and optimization of these networks. However, it is difficult to apply machine learning techniques to controlling and operating networks because of the inherently distributed structure of traditional networks. Software-Defined Networking (SDN) facilitates the application of machine learning techniques through centralized control of all network operations, holistic knowledge of the network, software-based monitoring of traffic, and updating of forwarding rules. This study makes an extensive review of existing literature in order to answer the research question of how machine learning techniques can be used in the context of SDN. First, it reviews the foundational literature. After this, a brief review of machine learning techniques is presented. We then delve into the application of machine learning techniques in the area of SDN, with a sharp focus on traffic classification, prediction of Quality of Service (QoS), optimization of routing and Quality of Experience (QoE), security, and resource management, each treated separately. Finally, we engage in discussions surrounding challenges and broader perspectives.
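
As a hedged sketch of the traffic-classification use case this abstract highlights, the example below trains a decision tree on simple flow statistics of the kind an SDN controller could collect from switch counters. The features, class labels, and synthetic data are illustrative assumptions; the paper surveys techniques rather than prescribing this model.

```python
# Sketch: flow classification from SDN flow statistics.
# Assumption: features (mean packet size, flow duration, packets/s)
# and classes (web, video, bulk) are illustrative, not from the paper.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)

def synth_flows(n, pkt_size, duration, rate, label):
    """Generate n synthetic flows around the given feature means."""
    X = np.column_stack([
        rng.normal(pkt_size, pkt_size * 0.1, n),
        rng.normal(duration, duration * 0.2, n),
        rng.normal(rate, rate * 0.2, n),
    ])
    return X, np.full(n, label)

Xw, yw = synth_flows(300, pkt_size=600, duration=2, rate=80, label="web")
Xv, yv = synth_flows(300, pkt_size=1300, duration=60, rate=900, label="video")
Xb, yb = synth_flows(300, pkt_size=1450, duration=30, rate=4000, label="bulk")

X = np.vstack([Xw, Xv, Xb])
y = np.concatenate([yw, yv, yb])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)

clf = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
# A controller could install per-class QoS or forwarding rules from this.
```
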
Deep Learning Security Schemes in IIoT: A Review Askar, Shavan; Hussein, Diana; Ibrahim, Media; Mohammed, Marwan Aziz
International Journal of Research and Applied Technology (INJURATECH) Vol. 5 No. 1 (2025): Vol 5 No 1 (2025)
Publisher : Universitas Komputer Indonesia

Abstract

The Industrial Internet of Things (IIoT) is a fast-growing technology that could digitize and connect numerous industries, offering substantial economic prospects and global GDP growth. With the fourth industrial revolution, IIoT platforms create massive, dynamic, and heterogeneous data from interconnected devices and sensors. Security and data analysis are complicated by such large and diverse data. As IIoT grows, cyberattacks become more diversified and complicated, making anomaly detection algorithms less successful. IIoT is utilized in manufacturing, logistics, transportation, oil and gas, mining, metallurgy, energy utilities, and aviation. IIoT offers significant potential for industrial application development; however, it also brings cyberattacks and higher security requirements. The enormous volume of data produced by IoT devices demands advanced data analysis and processing technologies like deep learning. Smart assembly, smart manufacturing, efficient networking, and accident detection and prevention are possible with DL algorithms in the IIoT. These many applications inspired this article on DL's potential in IIoT.
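
To ground the deep-learning anomaly detection this abstract describes, the hedged sketch below uses a small autoencoder-style network: it learns to reconstruct normal sensor readings, and readings with high reconstruction error are flagged as anomalous. The sensor features, network shape, and threshold are illustrative assumptions, not taken from the paper.

```python
# Autoencoder-style anomaly detection for IIoT sensor readings.
# Assumption: the features (temperature, vibration, pressure) and the
# 95th-percentile threshold are illustrative, not from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
normal = rng.normal([70.0, 0.5, 30.0], [2.0, 0.05, 1.0], size=(2000, 3))

scaler = StandardScaler().fit(normal)
Xn = scaler.transform(normal)

# Train the network to reproduce its own input through a bottleneck layer.
ae = MLPRegressor(hidden_layer_sizes=(8, 2, 8), max_iter=2000,
                  random_state=2).fit(Xn, Xn)

def recon_error(X):
    """Per-sample mean squared reconstruction error."""
    return np.mean((ae.predict(X) - X) ** 2, axis=1)

threshold = np.percentile(recon_error(Xn), 95)

faulty = scaler.transform([[95.0, 1.8, 12.0]])  # e.g. overheating machine
print(recon_error(faulty) > threshold)          # expected: [ True]
```
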
Comprehensive Review of Advanced Machine Learning Strategies for Resource Allocation in Fog Computing Systems Abdulwahab, Sara; Ibrahim, Media; Askar, Shavan; Hussien, Diana
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4632

Abstract

This paper targets advanced machine learning strategies for fog computing systems, aiming to further enhance current resource-allocation mechanisms. Fog computing extends cloud facilities to network edges, increasing data-processing capacity and allowing minimal latency for applications that need real-time processing. This review underlines deep learning as one of the basic tools through which neural networks predict resource usage and optimize resource allocation, dynamically adapting to changes in network conditions. The paper reviews techniques such as Convolutional Neural Networks, Recurrent Neural Networks, and Generative Adversarial Networks, exploring their roles in enhancing efficiency, privacy, and responsiveness in distributed environments. The findings reveal that deep learning significantly enhances operational performance, reduces latency, and strengthens security in fog networks. By processing data locally and autonomously managing resources, these strategies ensure efficient handling of diverse and dynamic demands. The paper concludes that the integration of machine learning into fog computing forms a scalable and robust framework for meeting the modern challenges imposed by digital ecosystems, enabling smarter real-time decision-making systems at the edge.
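
As a minimal stand-in for the resource-usage prediction this abstract attributes to neural networks, the hedged sketch below forecasts a fog node's CPU load from a sliding window of recent samples using a plain linear model; the review's recurrent models play the same role with more capacity. The synthetic load trace and window length are illustrative assumptions.

```python
# Sliding-window forecast of fog-node CPU load, a lightweight stand-in
# for the recurrent predictors the review discusses. The load trace
# and window length are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
t = np.arange(2000)
# Daily-like periodic load plus noise, roughly in [0, 1].
load = 0.5 + 0.3 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 0.02, t.size)

WINDOW = 12  # predict the next sample from the previous 12

X = np.lib.stride_tricks.sliding_window_view(load[:-1], WINDOW)
y = load[WINDOW:]

split = int(0.8 * len(X))
model = LinearRegression().fit(X[:split], y[:split])

pred = model.predict(X[split:])
mae = np.mean(np.abs(pred - y[split:]))
print(f"mean absolute error on held-out load: {mae:.4f}")
# An allocator could scale resources up when the forecast nears capacity.
```
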
The Role of Deep Learning in Network Intrusion Detection Systems: A Review Abdullah, Rebwar; Ibrahim, Media; Askar, Shavan; Hussein, Diana
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4734

Abstract

This review synthesizes findings from several key studies on the role of deep learning (DL) in network intrusion detection systems (NIDS). It highlights the growing importance of DL techniques for enhancing the detection of complex and evolving cyber threats. Traditional methods, such as signature-based or anomaly-based systems, often fail to detect modern attacks accurately, prompting researchers to explore DL to improve accuracy and adaptability. Several studies have demonstrated the effectiveness of convolutional neural networks (CNNs), recurrent neural networks (RNNs), and deep belief networks (DBNs) in classifying network traffic and identifying malicious activities. These deep learning models are particularly valuable because of their ability to automatically learn features from raw data, reducing the need for manual feature engineering. The review emphasizes the challenges in training DL models, including the need for large labelled datasets and issues with false positives and model interpretability. Despite these challenges, DL-based NIDS have shown significant improvements in real-time threat detection and mitigation rates. However, research is ongoing to optimize these models for better performance, scalability, and generalizability across different network environments. Overall, the integration of deep learning into NIDS represents a promising frontier in combating increasingly sophisticated cyberattacks.
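
To illustrate the "learn features from raw data" idea this abstract emphasizes, the hedged sketch below treats raw packet bytes as tiny images and trains a small CNN to separate two traffic classes. The synthetic data, network shape, and byte layout are illustrative assumptions; the surveyed papers use real datasets and larger architectures.

```python
# Sketch: treating raw packet bytes as a tiny image for a CNN classifier.
# Assumption: the synthetic 64-byte "packets" and two-class setup are
# illustrative, not a dataset or architecture from the review.
import torch
import torch.nn as nn

torch.manual_seed(0)

def synth_packets(n, offset):
    """Fake 64-byte packets as 1x8x8 'images', values scaled to [0, 1]."""
    base = torch.rand(n, 1, 8, 8)
    base[:, :, :2, :] += offset  # pretend the header region differs
    return base.clamp(0, 1)

benign = synth_packets(512, offset=0.0)
malicious = synth_packets(512, offset=0.5)
X = torch.cat([benign, malicious])
y = torch.cat([torch.zeros(512, dtype=torch.long),
               torch.ones(512, dtype=torch.long)])

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),            # 8x8 -> 4x4 feature maps
    nn.Flatten(),
    nn.Linear(8 * 4 * 4, 2),    # benign vs. malicious logits
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for _ in range(200):            # full-batch training on the toy data
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

acc = (model(X).argmax(dim=1) == y).float().mean()
print(f"training accuracy: {acc:.2f}")
```
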
Machine Learning Techniques for Enhancing Internet of Things (IoT) Performance: A Review Hussein, Diana; Abdullah, Rebwar; Askar, Shavan; Ibrahim, Media
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4735

Abstract

The Internet of Things (IoT) is essentially billions of interconnected smart devices that can communicate with little human intervention, making life easier. The IoT is a fast-moving area of research, and its challenges are growing, requiring continuous improvement. As IoT systems become more challenging to improve, machine learning (ML) is increasingly incorporated into them to develop better capabilities. This review explores several machine learning techniques aimed at enhancing the performance of IoT systems. It highlights the growing importance of integrating machine learning with IoT to address challenges such as data management, security, and real-time processing. The techniques discussed include supervised learning, unsupervised learning, reinforcement learning, deep learning, ensemble methods, anomaly detection, and federated learning. Each method is evaluated for its effectiveness in optimizing IoT applications such as predictive maintenance, energy efficiency, and smart city solutions. The review emphasizes the potential of these techniques to improve decision-making processes, automate operations, and enhance user experiences. Additionally, it addresses the limitations and challenges of implementing machine learning in IoT environments, including data privacy concerns and the need for robust algorithms capable of handling diverse datasets. Overall, the article underscores the transformative role of machine learning in advancing IoT capabilities and suggests future research directions to further leverage these technologies for improved system performance and reliability.
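
As a concrete instance of the federated learning this abstract lists, the hedged sketch below runs one round of federated averaging (FedAvg): each simulated device fits a local linear model on its private data, and only model weights, never raw data, are averaged at the server. The data generator, client count, and true model are illustrative assumptions.

```python
# One round of federated averaging (FedAvg) over simulated IoT devices.
# Each client trains locally on private data; only weights are shared.
# Client count and the true model (y = 2x + 1 + noise) are illustrative.
import numpy as np

rng = np.random.default_rng(4)
N_CLIENTS = 5

def local_fit(x, y):
    """Least-squares fit of y = w*x + b on one device's private data."""
    A = np.column_stack([x, np.ones_like(x)])
    (w, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.array([w, b])

client_weights, client_sizes = [], []
for _ in range(N_CLIENTS):
    n = rng.integers(50, 200)  # uneven data volumes across devices
    x = rng.uniform(0, 10, n)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.5, n)
    client_weights.append(local_fit(x, y))
    client_sizes.append(n)

# Server step: average weights, weighted by each client's sample count.
global_w = np.average(client_weights, axis=0, weights=client_sizes)
print(f"global model: w={global_w[0]:.3f}, b={global_w[1]:.3f}")  # ~2 and ~1
```
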