Myyara, Marouane
Unknown Affiliation

Published : 2 Documents
Articles

Found 2 Documents

A new approach based on genetic algorithm for computation offloading optimization in multi-access edge computing networks
Myyara, Marouane; Lagnfdi, Oussama; Darif, Anouar; Farchane, Abderrazak
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 13, No 4: December 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijai.v13.i4.pp4186-4194

Abstract

The proliferation of smart devices and the increasing demand for resource-intensive applications present significant challenges to computational efficiency, leading to a surge in data traffic. While cloud computing offers partial solutions, its centralized architecture raises latency concerns. Multi-access edge computing (MEC) emerges as a promising alternative by deploying servers at the network edge, bringing computation closer to user devices. However, optimizing computation offloading in the dynamic MEC environment remains a complex challenge. This paper introduces a novel genetic algorithm-based approach to efficient computation offloading in MEC that considers processing and transmission delays, user preferences, and system constraints. The proposed approach integrates a computation offloading and resource allocation algorithm based on evolutionary principles with a greedy strategy to maximize overall system performance. By utilizing genetic algorithms, the proposed method adapts dynamically to changing conditions, eliminating the need for intricate mathematical models and providing an appealing solution to the complexities inherent in MEC. The urgency of this research arises from the critical need to enhance mobile application performance. Simulation results demonstrate the robustness and efficacy of our approach in achieving near-optimal solutions while efficiently balancing computation offloading, minimizing latency, and maximizing resource utilization. Our approach offers flexibility and adaptability, contributing to the advancement of MEC networks and addressing the requirements of latency-sensitive applications.
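To give a flavor of the general technique the abstract describes, the sketch below shows a minimal genetic algorithm over binary offload/local decisions, where fitness is total delay (transmission plus edge processing when offloading, local processing otherwise). This is not the paper's actual model: the task sizes, CPU rates, bandwidth, and GA parameters are all hypothetical placeholders.

```python
# Minimal GA sketch for binary computation offloading (illustrative only;
# all constants below are hypothetical, not the paper's system model).
import random

random.seed(42)

N_TASKS = 8
task_cycles = [random.uniform(1e8, 5e8) for _ in range(N_TASKS)]  # CPU cycles per task
task_bits = [random.uniform(1e5, 1e6) for _ in range(N_TASKS)]    # upload size (bits)

LOCAL_CPU = 1e8   # device CPU rate (cycles/s)
EDGE_CPU = 1e9    # edge-server CPU rate (cycles/s)
BANDWIDTH = 1e6   # uplink rate (bits/s)

def total_delay(genome):
    """Fitness: sum of per-task delays; gene 1 = offload to edge, 0 = local."""
    delay = 0.0
    for gene, cycles, bits in zip(genome, task_cycles, task_bits):
        if gene:  # transmission delay + edge processing delay
            delay += bits / BANDWIDTH + cycles / EDGE_CPU
        else:     # local processing delay only
            delay += cycles / LOCAL_CPU
    return delay

def evolve(pop_size=30, generations=60, p_mut=0.1):
    """Truncation selection + one-point crossover + bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(N_TASKS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_delay)          # lower total delay = fitter
        elite = pop[: pop_size // 2]       # keep the best half (elitism)
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_TASKS)   # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(N_TASKS):             # bit-flip mutation
                if random.random() < p_mut:
                    child[i] ^= 1
            children.append(child)
        pop = elite + children
    return min(pop, key=total_delay)

best = evolve()
print("best offloading decision:", best)
print("total delay: %.3f s" % total_delay(best))
```

Because each gene's delay contribution is independent here, elitism plus bit-flip mutation behaves like parallel hill climbing and quickly reaches the best decision vector; the paper's setting adds shared-resource constraints and a greedy allocation step that this toy model omits.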
A new hybrid model based on machine learning and fuzzy logic for QoS enhancing in IoT
Lagnfdi, Oussama; Myyara, Marouane; Darif, Anouar
Indonesian Journal of Electrical Engineering and Computer Science Vol 41, No 2: February 2026
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v41.i2.pp624-632

Abstract

The fast expansion of internet of things (IoT) devices creates an increasingly complicated scenario for maintaining a stable quality of service (QoS) that guarantees the network's dependable operation. The emergence of increasingly complex applications that call for additional devices makes this even more crucial. Adaptive intelligence solutions that guarantee optimal network behavior are therefore required. This paper presents a hybrid optimized solution for a three-layer IoT network that models the application, network, and perception layers using machine learning and fuzzy logic (FL). This method guarantees optimal QoS prediction with improved network adaptability by using fuzzy membership parameters. When the number of devices increases from 100 to 1,500, the hybrid fuzzy logic-genetic algorithm (FLGA) approach maintains an average QoS declining from 95% to 87%, while FL maintains 84% and the RANDOM baseline 79%. At the application level, the genetic algorithm (GA) continues to outperform RANDOM by 15.57% and FL by 6.32%. The goal of this paper is to provide a solid network solution that could enhance the consistency of QoS performance in the increasingly complex scenario of an IoT network.
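As a rough illustration of the fuzzy-logic side of the hybrid approach, the sketch below fuzzifies two link metrics (latency and packet delivery ratio) with piecewise-linear membership functions and combines them through a small Mamdani-style rule base into a crisp QoS score. The membership breakpoints, rules, and output weights are invented for illustration; the paper's actual fuzzy parameters are not given in the abstract.

```python
# Illustrative fuzzy-logic QoS estimator for an IoT link (hand-rolled,
# no external libraries; all breakpoints and rule weights are hypothetical).

def ramp_down(x, a, b):
    """Membership 1 below a, 0 above b, linear in between (a 'low' shoulder)."""
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)

def ramp_up(x, a, b):
    """Mirror of ramp_down: 0 below a, 1 above b (a 'high' shoulder)."""
    return 1.0 - ramp_down(x, a, b)

def qos_score(latency_ms, delivery_ratio):
    """Combine two fuzzy inputs into a crisp QoS score in [0, 1]."""
    # Fuzzification (hypothetical breakpoints)
    lat_good = ramp_down(latency_ms, 20, 150)      # low latency is good
    pdr_good = ramp_up(delivery_ratio, 0.7, 0.95)  # high delivery ratio is good

    # Rule base: min for AND, max for OR (Mamdani-style)
    r_high = min(lat_good, pdr_good)               # both good -> QoS high
    r_med = max(lat_good, pdr_good)                # either good -> QoS medium
    r_low = min(1 - lat_good, 1 - pdr_good)        # neither good -> QoS low

    # Weighted-average defuzzification over output singletons 1.0 / 0.6 / 0.2
    num = r_high * 1.0 + r_med * 0.6 + r_low * 0.2
    den = r_high + r_med + r_low
    return num / den if den else 0.0

print("QoS (10 ms, 99% delivery):", qos_score(10, 0.99))
print("QoS (300 ms, 50% delivery):", qos_score(300, 0.5))
```

In the paper's hybrid scheme, a genetic algorithm could then tune such membership breakpoints and rule weights against observed network behavior, which is the role the FLGA variant plays in the reported comparison.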