Articles

Found 7 Documents

Dynamic Resource Allocation in Cloud Networks Using Deep Learning: A Review
Diana Hayder Hussein; Goran Maqdid; Shavan Askar; Media Ali Ibrahim
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4597

Abstract

Resource allocation has been a significant topic for both research and development over the last two decades. Given the increasing volume of data, the proliferation of connected devices, and the demand for seamless service delivery, optimal resource allocation has become a vital factor influencing cloud performance. Recently, deep learning, a subcategory of machine learning, has shown great potential to address this challenge by enabling predictive, adaptive, and self-organized resource allocation. This review covers the major milestones in dynamic resource allocation, discussing more than 25 peer-reviewed articles published between 2000 and 2024, and emphasizes the use of CNNs, RNNs, and other deep learning variants. By highlighting the pros and cons of each methodology, it gives a clearer view of their potential benefits. It also covers use cases and computational methodologies, addressing algorithmic novelty as well as challenges in scalability, latency, and energy efficiency. A comparative table summarizes the technological developments across the top ten studies to provide a meta-view. These findings have important implications for cloud service delivery in applications ranging from industrial automation to consumer-oriented services. They showcase the potential of deep learning to transform cloud network operations through advanced optimization and point out several open issues, including the integration of federated and edge learning models needed to achieve greater decentralization and to preserve network data privacy.
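
As a toy illustration of the predictive allocation idea surveyed above, the sketch below trains a small LSTM to forecast the next-step CPU utilization of a cloud resource from a sliding window of past measurements. The synthetic trace, window length, and network size are assumptions made purely for illustration; this is a minimal sketch, not a method drawn from any of the reviewed studies.

```python
# Minimal sketch: forecasting next-step CPU demand with an LSTM (hypothetical data).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic utilization trace in [0, 1]: a daily-like cycle plus noise.
t = torch.arange(0, 2000, dtype=torch.float32)
trace = 0.5 + 0.3 * torch.sin(2 * torch.pi * t / 288) + 0.05 * torch.randn_like(t)

WINDOW = 24  # past samples used to predict the next one
X = torch.stack([trace[i:i + WINDOW] for i in range(len(trace) - WINDOW)]).unsqueeze(-1)
y = trace[WINDOW:].unsqueeze(-1)

class DemandForecaster(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)          # (batch, window, hidden)
        return self.head(out[:, -1])   # predict from the last time step

model = DemandForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
    print(f"epoch {epoch}: mse={loss.item():.4f}")

# A scheduler could reserve capacity proportional to the forecast, e.g.:
next_demand = model(X[-1:]).item()
print("forecast next-step utilization:", round(next_demand, 3))
```

A scheduler built on such a forecaster could provision capacity ahead of predicted demand rather than reacting after utilization spikes, which is the adaptive behavior the reviewed work aims for.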
Integration of Deep Learning Applications and IoT for Smart Healthcare
Diana Hayder Hussein; Yousif Mohammed Ismail; Shavan Askar; Media Ali Ibrahim
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4611

Abstract

The integration of deep learning (DL) applications with the Internet of Things (IoT) has emerged as a transformative approach for advancing smart healthcare systems. This review synthesizes findings from seven research studies, each exploring the intersection of these technologies in improving healthcare delivery, patient monitoring, and medical decision-making. The paper highlights how IoT devices, including sensors and wearables, generate vast amounts of real-time health data, which DL models leverage for predictive analytics, diagnosis, and personalized treatment recommendations. Key areas explored include data acquisition and processing, where IoT-enabled sensors play a critical role in collecting physiological data such as heart rate, blood pressure, and glucose levels that DL algorithms then process to identify patterns and anomalies, and remote patient monitoring, where the combination of IoT and DL facilitates continuous monitoring of chronic conditions and allows real-time intervention, reducing hospital readmissions and enhancing patient independence.
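
As a toy illustration of the data-processing step described above, the following sketch trains a small feed-forward network to flag abnormal vital-sign readings (heart rate, systolic blood pressure, glucose). The data and the thresholds that generate the labels are synthetic and purely illustrative; a real system would learn from clinically labeled IoT streams.

```python
# Minimal sketch: flagging abnormal vital signs with a small neural network.
# Data and labeling thresholds are synthetic and illustrative only.
import torch
import torch.nn as nn

torch.manual_seed(0)

N = 2000
heart_rate = torch.normal(75.0, 15.0, size=(N,))
systolic_bp = torch.normal(120.0, 20.0, size=(N,))
glucose = torch.normal(100.0, 25.0, size=(N,))
X = torch.stack([heart_rate, systolic_bp, glucose], dim=1)

# Illustrative "abnormal" rule used only to create training labels.
y = ((heart_rate > 100) | (systolic_bp > 140) | (glucose > 140)).float().unsqueeze(1)

# Standardize features so the network trains smoothly.
X = (X - X.mean(0)) / X.std(0)

model = nn.Sequential(
    nn.Linear(3, 16), nn.ReLU(),
    nn.Linear(16, 1),  # logit for "abnormal"
)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

preds = (torch.sigmoid(model(X)) > 0.5).float()
print("training accuracy:", (preds == y).float().mean().item())
```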
Deep Learning Applications in Fog Computing Environments: A Review
Goran Maqdid; Media Ali Ibrahim; Shavan Askar; Diana Hayder Hussein
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4654

Abstract

This review investigates the transformative role of deep learning in fog computing environments, strongly emphasizing the synergy between these enabling technologies and their real-world impact across various domains. Fog computing is a decentralized approach to data processing that overcomes certain limitations of traditional cloud systems: it reduces latency by up to 50%, minimizes bandwidth usage, and alleviates network congestion. Deep learning, known for extracting patterns from complex datasets, enhances real-time analytics and intelligent decision-making in resource-constrained environments. Together, they enable effective processing and prompt decision-making in applications such as anomaly detection in healthcare (for example, detecting arrhythmias with 50% faster response), traffic flow optimization in smart cities, and predictive maintenance in industrial automation, where downtime is reduced by 60%. Integrating deep learning with fog computing offers numerous advantages, such as reduced dependence on cloud infrastructure, enhanced data privacy, and improved real-time processing. Yet several challenges remain, including the limited computational capacity of fog nodes, security vulnerabilities, and the need for scalable and efficient architectures. Recent lightweight model designs, federated learning techniques, and hierarchical frameworks are promising solutions to these challenges. This review synthesizes current research findings, identifies sector-specific applications, and addresses critical challenges. It also outlines future directions, including the development of adaptive architectures, privacy-preserving methodologies, and hybrid artificial intelligence approaches. Meeting these challenges will unlock the full potential of deep learning and fog computing, driving innovation and efficiency across industries.
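
To make the federated learning idea concrete, the sketch below averages the parameters of models trained on three hypothetical fog nodes into a single global model, a FedAvg-style aggregation step. The node data, model shape, and equal-weight averaging are illustrative assumptions, not a reconstruction of any specific system from the reviewed literature.

```python
# Minimal sketch: FedAvg-style parameter averaging across fog nodes (illustrative).
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_model():
    return nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))

def local_update(model, X, y, steps=20, lr=1e-2):
    """One round of local training on a node's private data."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(X), y).backward()
        opt.step()
    return model.state_dict()

global_model = make_model()

# Three fog nodes, each with its own (synthetic) local sensor data.
node_data = [(torch.randn(64, 4), torch.randn(64, 1)) for _ in range(3)]

for round_idx in range(5):
    local_states = []
    for X, y in node_data:
        local_model = copy.deepcopy(global_model)   # start from current global weights
        local_states.append(local_update(local_model, X, y))

    # Equal-weight average of parameters; raw data never leaves the nodes.
    avg_state = {
        key: torch.stack([state[key] for state in local_states]).mean(dim=0)
        for key in local_states[0]
    }
    global_model.load_state_dict(avg_state)
    print(f"round {round_idx}: aggregated {len(local_states)} node updates")
```

Only model parameters cross the network in this scheme, which is why federated training is repeatedly cited in the reviewed work as a way to improve data privacy at resource-constrained fog nodes.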
Deep Learning for Dynamic Resource Management in 5G Networks: A Review
Diana Hayder Hussein; Sara Abdulwahab; Shavan Askar; Media Ali Ibrahim
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4688

Abstract

Dynamic resource management is essential for 5G wireless networks to ensure they are efficient and scalable and can handle growing connectivity demands while maintaining quality of service. The aim of this review is to discuss how deep learning has changed the way complex challenges are addressed in resource allocation, frequency spectrum management, energy efficiency, and runtime decision-making in 5G wireless networks. It synthesizes leading-edge research insights to show how advanced deep learning techniques, such as supervised learning and federated learning, enable intelligent, adaptive solutions that go beyond conventional approaches. The review compares the strengths of these methodologies for network performance optimization while pointing out limitations related to computational complexity and the lack of extensive real-world testing. It further elaborates on promising future directions, ranging from federated learning for decentralized resource management to enhancing the interpretability of deep learning models and leveraging diverse datasets to improve robustness. The discussion also covers the arrival of 6G networks, which will introduce refined, AI-driven approaches to resource optimization. By establishing links between theoretical developments and practical uses, the review pinpoints the transformative potential of deep learning not only in reshaping the wireless communication networks of the future but also in opening new frontiers well beyond 5G.
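
As a small, self-contained illustration of supervised learning for resource allocation, the model below learns to map observed per-slice traffic demands to bandwidth shares. The demands, the number of slices, and the proportional-share targets are assumptions made for the sketch and are not drawn from the reviewed papers.

```python
# Minimal sketch: learning to map per-slice traffic demand to bandwidth shares.
# Demands and the proportional-share targets are synthetic assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

N_SLICES = 4                      # e.g. eMBB, URLLC, mMTC, background
demands = torch.rand(4096, N_SLICES) * 10.0
targets = demands / demands.sum(dim=1, keepdim=True)   # proportional shares

model = nn.Sequential(
    nn.Linear(N_SLICES, 32), nn.ReLU(),
    nn.Linear(32, N_SLICES),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def allocate(x):
    # Softmax guarantees the predicted shares are positive and sum to 1.
    return torch.softmax(model(x), dim=1)

for epoch in range(300):
    opt.zero_grad()
    loss = nn.functional.mse_loss(allocate(demands), targets)
    loss.backward()
    opt.step()

example = torch.tensor([[8.0, 1.0, 0.5, 0.5]])
print("predicted shares:", allocate(example).detach().numpy().round(3))
```

The softmax output head is the relevant design choice here: it keeps every allocation feasible (non-negative shares summing to one) without any post-processing of the network output.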
Machine Learning for Network Anomaly Detection: A Review
Nawzad Hamad Mahmood; Diana Hayder Hussein; Shavan Askar; Media Ali Ibrahim
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4703

Abstract

This research aims to investigate the application of machine learning (ML) techniques in network anomaly detection to enhance security in the face of evolving cyber threats. Employing a systematic review of existing literature and experimental evaluation, the study explores the effectiveness of various ML algorithms and their capacity to detect anomalies in network traffic. Unlike traditional rule-based methods, ML algorithms analyze extensive traffic data to distinguish normal from abnormal behavior, adapting dynamically to new threats in real time. Key methodologies include feature engineering to optimize model performance, focusing on attributes like packet size and flow duration. The research evaluates detection accuracy, reduction of false positives, and the adaptability of ML-based systems to changing conditions. Main outcomes demonstrate that ML offers significant advantages over heuristic approaches, with improved detection rates, minimized human intervention, and enhanced responsiveness to emerging threats. The findings underscore the importance of real-time detection capabilities and highlight challenges such as computational complexity and dataset quality. By addressing these challenges, the study contributes valuable insights into strengthening network defense mechanisms through advanced ML applications.
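
To ground the feature-engineering point, the sketch below fits an unsupervised anomaly detector to two of the features the abstract mentions, packet size and flow duration. The traffic statistics are synthetic and the choice of Isolation Forest is an illustrative assumption; the reviewed work covers a range of ML algorithms.

```python
# Minimal sketch: unsupervised anomaly detection on flow features (synthetic data).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Normal traffic: typical packet sizes (bytes) and flow durations (seconds).
normal = np.column_stack([
    rng.normal(800, 150, 5000),      # packet size
    rng.exponential(2.0, 5000),      # flow duration
])

# A handful of anomalous flows: tiny packets in very long-lived flows
# (a rough stand-in for, say, low-and-slow exfiltration).
anomalies = np.column_stack([
    rng.normal(80, 10, 25),
    rng.exponential(60.0, 25) + 30,
])

X_train = normal
X_test = np.vstack([normal[:100], anomalies])

detector = IsolationForest(contamination=0.01, random_state=0).fit(X_train)
scores = detector.predict(X_test)           # +1 = normal, -1 = anomaly

print("flagged as anomalous:", int((scores == -1).sum()), "of", len(X_test))
```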
Quality of Service (QoS) Optimization in 5G Using Machine Learning
Diana Hayder Hussein; Nawzad Hamad Mahmood; Shavan Askar; Media Ali Ibrahim
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4706

Abstract

The emergence of 5G networks has revolutionized communication systems by providing unprecedented speed, connectivity, and reliability. This breakthrough technology enables diverse applications such as autonomous vehicles, smart cities, and industrial automation through higher bandwidth and ultra-low latency. However, maintaining consistent Quality of Service (QoS) across these varied applications presents significant challenges due to their conflicting demands. Traditional QoS management methods struggle to address the dynamic and complex requirements of 5G, prompting the adoption of Machine Learning (ML) techniques. ML offers intelligent, adaptive solutions for traffic prediction, network slicing, and real-time decision-making, ensuring improved resource allocation and seamless service delivery.
Deep Learning Techniques for Network Security
Yousif Mohammed Ismail; Diana Hayder Hussein; Shavan Askar; Media Ali Ibrahim
The Indonesian Journal of Computer Science Vol. 14 No. 1 (2025): The Indonesian Journal of Computer Science (IJCS)
Publisher : AI Society & STMIK Indonesia

DOI: 10.33022/ijcs.v14i1.4737

Abstract

This article explores seven prominent deep learning techniques used to enhance network security. It provides a comprehensive analysis of how these techniques address various cybersecurity challenges, including intrusion detection, malware classification, and anomaly detection. The review highlights the effectiveness of deep learning models such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and autoencoders in processing large datasets and identifying complex patterns that represent security threats. The article also discusses the advantages and limitations of each technique, emphasizing the importance of feature extraction, model training, and real-time processing capabilities. By combining the findings of current research, this review aims to guide future research and practical implementation of deep learning in securing network infrastructure against evolving cyber threats. The review provides a comprehensive summary of the deep learning techniques used in network security, highlighting their strengths and limitations. The findings show that deep learning has significant potential to improve detection of and response to network threats, although challenges related to model interpretability, data quality, and computational efficiency must still be addressed.
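
As one concrete instance of the autoencoder approach mentioned above, the sketch below trains a small autoencoder on feature vectors of benign traffic and flags inputs whose reconstruction error exceeds a percentile threshold. The feature dimension, data, and threshold are illustrative assumptions rather than a setup taken from the reviewed studies.

```python
# Minimal sketch: autoencoder-based anomaly detection for network traffic features.
# Feature vectors, dimensions, and threshold are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

N_FEATURES = 20
benign = torch.randn(4000, N_FEATURES)                 # stand-in for normalized flow features
attack = torch.randn(50, N_FEATURES) * 3.0 + 4.0       # distribution shift = "attack" traffic

autoencoder = nn.Sequential(
    nn.Linear(N_FEATURES, 8), nn.ReLU(),               # encoder: compress to 8 dims
    nn.Linear(8, N_FEATURES),                          # decoder: reconstruct input
)
opt = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train only on benign traffic so attack traffic reconstructs poorly.
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(autoencoder(benign), benign)
    loss.backward()
    opt.step()

def reconstruction_error(x):
    with torch.no_grad():
        return ((autoencoder(x) - x) ** 2).mean(dim=1)

threshold = torch.quantile(reconstruction_error(benign), 0.99)
flags = reconstruction_error(attack) > threshold
print("attack samples flagged:", int(flags.sum()), "of", len(attack))
```

The appeal of this setup, as the reviewed abstracts note, is that it needs no labeled attack data during training; the trade-off is that the threshold and feature normalization must be tuned carefully to keep false positives low.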