Jâafar Abouchabaka
Ibn Tofail University

Published: 6 Documents

Parallel genetic approach for routing optimization in large ad hoc networks Hala Khankhour; Otman Abdoun; Jâafar Abouchabaka
International Journal of Electrical and Computer Engineering (IJECE) Vol 12, No 1: February 2022
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v12i1.pp748-755

Abstract

This article presents a new approach that integrates parallelism into the genetic algorithm (GA) to solve the routing problem in a large ad hoc network, where the goal is to find the shortest routing path. First, we fix the source and destination and use variable-length chromosomes (routes) whose genes are nodes. Our work answers the following question: which method, sequential or parallel, is better at finding the shortest path? All modern systems support concurrent processes and threads: processes are instances of programs that generally run independently (for example, starting a program causes the operating system to spawn a new process that runs in parallel with other programs), and within these processes, threads can execute code simultaneously, making the most of the available central processing unit (CPU) cores. The obtained results show that our algorithm yields solutions of much better quality. We first study the difference between the sequential and parallel methods on an example network of 40 nodes, and then increase the number of sensors to 100 nodes to solve the shortest-path problem in a large ad hoc network.
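
The island-style parallelism the abstract describes can be sketched as follows: each thread evolves its own subpopulation of variable-length routes (chromosomes) whose genes are nodes, and the cheapest route across islands wins. The graph, costs, and GA parameters below are invented for illustration, not taken from the paper.

```python
import random
from concurrent.futures import ThreadPoolExecutor

# Adjacency list with hypothetical hop costs; node names are invented.
GRAPH = {
    'S': {'A': 2, 'B': 5},
    'A': {'B': 1, 'C': 2},
    'B': {'D': 2},
    'C': {'D': 3, 'T': 6},
    'D': {'T': 1},
    'T': {},
}

def random_route(src='S', dst='T', max_len=8):
    """Random loop-free walk from src; a valid chromosome ends at dst."""
    route = [src]
    while route[-1] != dst and len(route) < max_len:
        candidates = [n for n in GRAPH[route[-1]] if n not in route]
        if not candidates:
            return None
        route.append(random.choice(candidates))
    return route if route[-1] == dst else None

def cost(route):
    return sum(GRAPH[a][b] for a, b in zip(route, route[1:]))

def mutate(route, dst='T', max_len=8):
    """Cut the chromosome at a random gene and re-grow the tail randomly."""
    prefix = list(route[:random.randrange(1, len(route))])
    while prefix[-1] != dst and len(prefix) < max_len:
        candidates = [n for n in GRAPH[prefix[-1]] if n not in prefix]
        if not candidates:
            return None
        prefix.append(random.choice(candidates))
    return prefix if prefix[-1] == dst else None

def evolve_island(_):
    """One subpopulation: selection keeps the cheapest routes each generation."""
    population = [r for r in (random_route() for _ in range(50)) if r]
    for _ in range(40):
        children = [mutate(random.choice(population)) for _ in range(20)]
        population.extend(c for c in children if c)
        population = sorted(population, key=cost)[:20]
    return population[0]

# Each thread evolves its own island; the cheapest route across islands wins.
with ThreadPoolExecutor(max_workers=4) as pool:
    best = min(pool.map(evolve_island, range(4)), key=cost)
print(best, cost(best))
```

On this toy graph the optimal route is S, A, B, D, T with cost 6; a real comparison would, as in the paper, scale the node count and compare sequential against threaded wall-clock time.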
An intelligent irrigation system based on internet of things (IoT) to minimize water loss Samar Amassmir; Said Tkatek; Otman Abdoun; Jaafar Abouchabaka
Indonesian Journal of Electrical Engineering and Computer Science Vol 25, No 1: January 2022
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v25.i1.pp504-510

Abstract

This paper proposes a comparison of three machine learning algorithms for a better intelligent irrigation system based on internet of things (IoT) for differents products. This work's major contribution is to specify the most accurate algorithm among the three machine learning algorithms (k-nearest neighbors (KNN), support vector machine (SVM), artificial neural network (ANN)). This is achieved by collecting irrigation data of a specific products and split it into training data and test data then compare the accuracy of the three algorithms. To evaluate the performance of our algorithm we built a system of IoT devices. The temperature and humidity sensors are installed in the field interact with the Arduino microcontroller. The Arduino is connected to Raspberry Pi3, which holds the machine learning algorithm. It turned out to be ANN algorithm is the most accurate for such system of irrigation. The ANN algorithm is the best choice for an intelligent system to minimize water loss for some products.
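
The train/test comparison workflow the abstract describes can be sketched in a few lines. A pure-Python k-nearest-neighbors classifier stands in for the KNN/SVM/ANN models the paper compares, and the (temperature, humidity) → irrigate samples are invented for illustration.

```python
import math
import random

def knn_predict(train, x, k=3):
    """Majority vote among the k training samples nearest to x."""
    nearest = sorted(train, key=lambda s: math.dist(s[0], x))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

def accuracy(train, test, k=3):
    hits = sum(knn_predict(train, x, k) == y for x, y in test)
    return hits / len(test)

# Invented rule: irrigate (1) when it is hot and dry, otherwise do not (0).
data = [((t, h), 1 if t > 30 and h < 40 else 0)
        for t in range(20, 45, 2) for h in range(20, 80, 5)]
random.seed(0)
random.shuffle(data)
split = int(0.8 * len(data))            # 80/20 train/test split
train, test = data[:split], data[split:]
print(f"KNN accuracy: {accuracy(train, test):.2f}")
```

In the paper's setup the same split-and-score loop would be run for all three models, and the most accurate one (ANN, per the abstract) deployed on the Raspberry Pi.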
Intelligent system for recruitment decision making using an alternative parallel-sequential genetic algorithm Said Tkatek; Saadia Bahti; Otman Abdoun; Jaafar Abouchabaka
Indonesian Journal of Electrical Engineering and Computer Science Vol 22, No 1: April 2021
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v22.i1.pp385-395

Abstract

The human resources (HR) manager needs effective tools to move away from traditional recruitment processes and make the right decision: selecting the right candidates for the right posts. To this end, we deliver an intelligent recruitment decision-making method for HR that incorporates a recruitment model based on the multipack model, which is NP-hard. The system, a decision support tool, integrates a genetic approach that operates alternately in parallel and sequentially. This approach provides the best recruiting solution, allowing HR managers to make the right decision and ensure the best possible compatibility with the desired objectives. Operationally, the system can also predict the alternating choice between the parallel genetic algorithm (PGA) and the sequential genetic algorithm (SeqGA), depending on the instance size and the constraints of the recruiting posts, so as to produce a quality solution in reduced CPU time for recruitment decision-making. The results obtained in various tests confirm the performance of this intelligent system, which can be used as a decision support tool for intelligently optimized recruitment.
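
The alternation idea can be sketched as a dispatcher that runs a sequential search on small instances and a thread-parallel one on large instances. The compatibility matrix, the size threshold, and the random-search stand-in for the GA are all illustrative assumptions, not the paper's model.

```python
import random
from concurrent.futures import ThreadPoolExecutor

random.seed(1)
N_CANDIDATES, N_POSTS = 8, 3
# Invented candidate/post compatibility scores in [0, 1).
SCORE = [[random.random() for _ in range(N_POSTS)] for _ in range(N_CANDIDATES)]

def fitness(assignment):
    """Total compatibility; assignment[p] is the candidate chosen for post p."""
    return sum(SCORE[c][p] for p, c in enumerate(assignment))

def evolve(trials=300):
    """Random-search stand-in for one GA run over distinct assignments."""
    return max((random.sample(range(N_CANDIDATES), N_POSTS)
                for _ in range(trials)), key=fitness)

def solve(threshold=1000):
    size = N_CANDIDATES * N_POSTS
    if size <= threshold:                           # small instance: SeqGA
        return "SeqGA", evolve()
    with ThreadPoolExecutor(max_workers=4) as pool:  # large instance: PGA
        best = max(pool.map(lambda _: evolve(), range(4)), key=fitness)
    return "PGA", best

mode, best = solve()
print(mode, best, round(fitness(best), 3))
```

With 8 candidates and 3 posts the instance is tiny, so the dispatcher picks the sequential branch; raising the instance size past the threshold would switch it to the island-parallel branch.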
Predictive fertilization models for potato crops using machine learning techniques in Moroccan Gharb region Said Tkatek; Samar Amassmir; Amine Belmzoukia; Jaafar Abouchabaka
International Journal of Electrical and Computer Engineering (IJECE) Vol 13, No 5: October 2023
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v13i5.pp5942-5950

Abstract

Given the influence of several factors, including weather, soils, land management, genotypes, and the severity of pests and diseases, prescribing adequate nutrient levels is difficult. When enough data are available, a potato crop's performance can be predicted using machine learning techniques. This study aimed to develop a highly precise model for determining the optimal levels of nitrogen, phosphorus, and potassium required to achieve both high-quality and high-yield potato crops, taking into account the impact of environmental factors such as weather, soil type, and land management practices. We used a data set of 900 field experiments from Kaggle. We developed, evaluated, and compared prediction models based on k-nearest neighbors (KNN), linear support vector machine (SVM), the naive Bayes (NB) classifier, the decision tree (DT) regressor, the random forest (RF) regressor, and eXtreme gradient boosting (XGBoost). We used measures such as mean absolute error (MAE), mean squared error (MSE), R-squared (R2), and root mean squared error (RMSE) to describe each model's errors and predictive capacity. The XGBoost model achieved the best R2, MSE, and MAE values, and overall it outperformed the other machine learning models. Finally, we suggested a hardware implementation to help farmers in the field.
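
The four evaluation measures named above are standard and can be written out directly; the y_true/y_pred values below are invented for illustration.

```python
def mae(y, yhat):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def mse(y, yhat):
    """Mean squared error."""
    return sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y)

def rmse(y, yhat):
    """Root mean squared error."""
    return mse(y, yhat) ** 0.5

def r2(y, yhat):
    """R-squared: 1 minus residual variance over total variance."""
    mean = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - mean) ** 2 for a in y)
    return 1 - ss_res / ss_tot

y_true = [120, 140, 100, 160]   # e.g. recommended nitrogen doses (invented)
y_pred = [118, 143, 104, 158]
print(mae(y_true, y_pred), rmse(y_true, y_pred), r2(y_true, y_pred))
```

Note the direction of each measure: the best model minimizes MAE, MSE, and RMSE while maximizing R2, which is how the XGBoost comparison above should be read.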
Adaptive traffic lights based on traffic flow prediction using machine learning models Idriss Moumen; Jaafar Abouchabaka; Najat Rafalia
International Journal of Electrical and Computer Engineering (IJECE) Vol 13, No 5: October 2023
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v13i5.pp5813-5823

Abstract

Traffic congestion prediction is one of the essential components of intelligent transport systems (ITS), owing to rapid population growth and, consequently, the high number of vehicles in cities. The problem of traffic congestion now attracts more and more attention from researchers in the ITS field. Traffic congestion can be predicted in advance by analyzing traffic flow data. In this article, we used machine learning algorithms such as linear regression, random forest regressor, decision tree regressor, gradient boosting regressor, and k-neighbors regressor to predict traffic flow and reduce traffic congestion at intersections, testing our models on the public roads dataset from the UK national road traffic. All the machine learning algorithms obtained good performance metrics, indicating that they are suitable for implementation in smart traffic light systems. We then implemented an adaptive traffic light system based on a random forest regressor model, which adjusts the timing of green and red lights depending on road width, traffic density, vehicle types, and expected traffic. Simulations of the proposed system show a 30.8% reduction in traffic congestion, justifying its effectiveness and the case for deploying it to regulate signaling at intersections.
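
The adaptive timing rule can be sketched as a function that scales green-light duration with predicted per-lane demand and the share of heavy vehicles, clamped to safe bounds. The weights, bounds, and inputs are illustrative assumptions, not the paper's fitted random forest model.

```python
def green_time(predicted_flow, n_lanes, heavy_share,
               base=15, per_vehicle=0.5, heavy_bonus=10,
               min_g=10, max_g=90):
    """Seconds of green for one approach in the next signal cycle.

    predicted_flow: vehicles expected during the cycle (from the regressor)
    n_lanes:        road width expressed as lane count
    heavy_share:    fraction of heavy vehicles (slower to clear), in [0, 1]
    """
    demand = predicted_flow / max(n_lanes, 1)       # vehicles per lane
    g = base + per_vehicle * demand + heavy_bonus * heavy_share
    return min(max_g, max(min_g, round(g)))         # clamp to safe bounds

# e.g. 80 vehicles predicted over 2 lanes, 10% heavy vehicles
print(green_time(80, 2, 0.10))
```

In a full system this function would be called once per approach per cycle, with `predicted_flow` supplied by the trained traffic-flow regressor.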
Enhancing Hadoop distributed storage efficiency using multi-agent systems Rabie Mahdaoui; Manar Sais; Jaafar Abouchabaka; Najat Rafalia
Indonesian Journal of Electrical Engineering and Computer Science Vol 34, No 3: June 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v34.i3.pp1814-1822

Abstract

Distributed storage systems play a pivotal role in modern data-intensive applications, with the Hadoop distributed file system (HDFS) being a prominent example. However, optimizing the efficiency of such systems remains a complex challenge. This research paper presents a novel approach to enhancing the efficiency of distributed storage by leveraging multi-agent systems (MAS). Our research centers on improving HDFS efficiency by incorporating intelligent agents that can dynamically assign storage tasks to nodes based on their performance characteristics. Using a decentralized decision-making framework, the suggested MAS-based approach considers the real-time performance of nodes and allocates storage tasks adaptively. This strategy aims to alleviate performance bottlenecks and minimize data transfer latency. Through extensive experimental evaluation, we demonstrate the effectiveness of our approach in improving HDFS performance in terms of data storage, retrieval, and overall system efficiency. The results reveal significant reductions in job execution times and enhanced resource utilization, thereby offering a promising avenue for enhancing the efficiency of distributed storage systems.
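
The adaptive allocation idea can be sketched as agents that score storage nodes from live metrics and route each block to the currently best node, updating their local view as capacity is consumed. The node names, metrics, and scoring formula are illustrative assumptions, not the paper's agent design.

```python
def node_score(node):
    """Higher is better: favour free capacity and low observed latency."""
    return node["free_gb"] / (1.0 + node["latency_ms"])

def assign_blocks(nodes, n_blocks, block_gb=0.128):
    """Greedy per-block placement; each agent updates its local node view."""
    placement = []
    for _ in range(n_blocks):
        best = max(nodes, key=node_score)
        best["free_gb"] -= block_gb      # capacity consumed by this block
        placement.append(best["name"])
    return placement

# Invented DataNode metrics, as an agent might observe them at runtime.
nodes = [
    {"name": "dn1", "free_gb": 500.0, "latency_ms": 4.0},
    {"name": "dn2", "free_gb": 500.0, "latency_ms": 20.0},
    {"name": "dn3", "free_gb": 80.0,  "latency_ms": 2.0},
]
placement = assign_blocks(nodes, 5)
print(placement)
```

Here all five blocks land on dn1, whose capacity/latency score dominates; as its free space drains or its latency rises, later blocks would shift to other nodes, which is the adaptive behaviour the decentralized framework aims for.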