Jâafar Abouchabaka
Ibn Tofail University

Published: 6 Documents

Articles

Found 3 Documents
Journal: International Journal of Electrical and Computer Engineering

Parallel genetic approach for routing optimization in large ad hoc networks Hala Khankhour; Otman Abdoun; Jâafar Abouchabaka
International Journal of Electrical and Computer Engineering (IJECE) Vol 12, No 1: February 2022
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v12i1.pp748-755

Abstract

This article presents a new approach that integrates parallelism into the genetic algorithm (GA) to solve the routing problem in a large ad hoc network, where the goal is to find the shortest routing path. First, we fix the source and destination and use variable-length chromosomes (routes) whose genes are nodes. In this work we answer the following question: which method finds the shortest path better, the sequential or the parallel one? All modern systems support concurrent processes and threads. Processes are instances of programs that generally run independently; for example, when a program is started, the operating system spawns a new process that runs in parallel with other programs. Within these processes, threads can be used to execute code simultaneously, making the most of the available central processing unit (CPU) cores. The obtained results show that our algorithm yields solutions of much better quality. Finally, we propose an example network with 40 nodes to study the difference between the sequential and parallel methods, and then increase the number of sensors to 100 nodes to solve the shortest-path problem in a large ad hoc network.
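The core idea of the abstract, evaluating variable-length route chromosomes concurrently across CPU threads, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy graph, the `route_cost` fitness, and the thread-pool evaluation are all hypothetical stand-ins for the GA described above.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical weighted ad hoc network: node -> {neighbor: link cost}.
GRAPH = {
    0: {1: 4, 2: 1},
    1: {3: 1},
    2: {1: 2, 3: 5},
    3: {},
}

def route_cost(route):
    """Fitness of a variable-length chromosome (a route; its genes are nodes).
    Routes that use a nonexistent link are penalized with infinite cost."""
    cost = 0
    for a, b in zip(route, route[1:]):
        if b not in GRAPH[a]:
            return float("inf")
        cost += GRAPH[a][b]
    return cost

def evaluate_parallel(population, workers=4):
    """Evaluate every chromosome concurrently, one task per thread,
    so fitness evaluation can use all available CPU cores."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(route_cost, population))

# Three candidate routes from source node 0 to destination node 3.
population = [[0, 1, 3], [0, 2, 3], [0, 2, 1, 3]]
costs = evaluate_parallel(population)
best = population[costs.index(min(costs))]  # shortest route found so far
```

In a full GA this evaluation step would sit inside the selection/crossover/mutation loop; parallelizing it is attractive because fitness evaluation dominates the runtime when populations and networks grow large.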
Predictive fertilization models for potato crops using machine learning techniques in Moroccan Gharb region Said Tkatek; Samar Amassmir; Amine Belmzoukia; Jaafar Abouchabaka
International Journal of Electrical and Computer Engineering (IJECE) Vol 13, No 5: October 2023
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v13i5.pp5942-5950

Abstract

Given the influence of several factors, including weather, soils, land management, genotypes, and the severity of pests and diseases, prescribing adequate nutrient levels is difficult. A potato crop’s performance can be predicted using machine learning techniques when enough data is available. This study aimed to develop a highly precise model for determining the optimal levels of nitrogen, phosphorus, and potassium required to achieve both high-quality and high-yield potato crops, taking into account the impact of various environmental factors such as weather, soil type, and land management practices. We used a data set of 900 field experiments from Kaggle. We developed, evaluated, and compared prediction models based on k-nearest neighbors (KNN), linear support vector machine (SVM), naive Bayes (NB) classifier, decision tree (DT) regressor, random forest (RF) regressor, and eXtreme gradient boosting (XGBoost). We used measures such as mean absolute error (MAE), mean squared error (MSE), R-squared (R2), and root mean squared error (RMSE) to describe each model’s errors and predictive capacity. It turned out that the XGBoost model has the highest R2 and the lowest MSE and MAE values. Overall, the XGBoost model outperforms the other machine learning models. Finally, we suggested a hardware implementation to help farmers in the field.
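The four evaluation measures named in the abstract (MAE, MSE, R2, RMSE) can be computed directly from predictions; a minimal pure-Python sketch is shown below. The NPK-yield numbers are synthetic placeholders, not values from the Kaggle data set or the paper.

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute MAE, MSE, RMSE, and R2 for a list of true vs. predicted values."""
    n = len(y_true)
    errors = [t - p for t, p in zip(y_true, y_pred)]
    mae = sum(abs(e) for e in errors) / n            # mean absolute error
    mse = sum(e * e for e in errors) / n             # mean squared error
    rmse = math.sqrt(mse)                            # root mean squared error
    mean_y = sum(y_true) / n
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)  # total sum of squares
    r2 = 1 - (mse * n) / ss_tot                      # coefficient of determination
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "R2": r2}

# Toy yield values (tons/ha, hypothetical) for three fertilization trials.
metrics = regression_metrics([3.0, 5.0, 7.0], [2.5, 5.5, 7.0])
```

Ranking candidate models then amounts to preferring the one with the highest R2 and the lowest error metrics, which is the comparison the abstract reports in XGBoost's favor.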
Adaptive traffic lights based on traffic flow prediction using machine learning models Idriss Moumen; Jaafar Abouchabaka; Najat Rafalia
International Journal of Electrical and Computer Engineering (IJECE) Vol 13, No 5: October 2023
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v13i5.pp5813-5823

Abstract

Traffic congestion prediction is one of the essential components of intelligent transport systems (ITS). This is due to rapid population growth and, consequently, the high number of vehicles in cities. Nowadays, the problem of traffic congestion attracts more and more attention from researchers in the field of ITS. Traffic congestion can be predicted in advance by analyzing traffic flow data. In this article, we used machine learning algorithms such as linear regression, random forest regressor, decision tree regressor, gradient boosting regressor, and k-nearest neighbors regressor to predict traffic flow and reduce traffic congestion at intersections. We used the public roads dataset from the UK national road traffic to test our models. All machine learning algorithms obtained good performance metrics, indicating that they are valid for implementation in smart traffic light systems. Next, we implemented an adaptive traffic light system based on a random forest regressor model, which adjusts the timing of green and red lights depending on the road width, traffic density, types of vehicles, and expected traffic. Simulations of the proposed system show a 30.8% reduction in traffic congestion, justifying its effectiveness and the value of deploying it to address the signaling problem at intersections.
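The adaptive-timing idea described above, adjusting green time according to predicted traffic, can be sketched with a simple proportional allocation. This is an assumed scheme for illustration only: the fixed cycle length, the minimum-green guarantee, and the per-approach flow values are hypothetical, and in the paper the predicted flows would come from the random forest regressor rather than being supplied by hand.

```python
def green_times(predicted_flows, cycle=90, min_green=10):
    """Split a fixed signal cycle (seconds) among intersection approaches
    in proportion to their predicted traffic flow, while guaranteeing
    each approach a minimum green time."""
    total = sum(predicted_flows)
    reserve = min_green * len(predicted_flows)  # seconds set aside for minimums
    spare = cycle - reserve                     # seconds allocated by demand
    return [min_green + spare * f / total for f in predicted_flows]

# Predicted flows (vehicles/hour, hypothetical) for three approaches;
# the busiest approach receives the longest green phase.
times = green_times([120, 60, 20])
```

The allocation always sums to the cycle length, so lengthening one approach's green phase automatically shortens the others, which is how such a controller relieves the most congested direction.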