Contact Name
Rahmat Hidayat
Contact Email
mr.rahmat@gmail.com
Phone
-
Journal Mail Official
rahmat@pnp.ac.id
Editorial Address
-
Location
Kota Padang,
Sumatera Barat
INDONESIA
JOIV : International Journal on Informatics Visualization
ISSN: 2549-9610     EISSN: 2549-9904     DOI: -
Core Subject: Science
JOIV : International Journal on Informatics Visualization is an international peer-reviewed journal dedicated to the exchange of high-quality research results in all aspects of Computer Science, Computer Engineering, Information Technology, and Visualization. The journal publishes state-of-the-art papers on fundamental theory, experiments, and simulation, as well as applications, with a systematically proposed method, a sufficient review of previous work, an expanded discussion, and a concise conclusion. As part of our commitment to the advancement of science and technology, JOIV follows an open access policy that makes published articles freely available online without any subscription.
Arjuna Subject : -
Articles: 55 documents in issue Vol 8, No 2 (2024)
Denoising Ambulatory Electrocardiogram Signal Using Interval-Dependent Thresholds Based Stationary Wavelet Transform Hermawan, Indra; Sevani, Nina; F. Abka, Achmad; Jatmiko, Wisnu
JOIV : International Journal on Informatics Visualization Vol 8, No 2 (2024)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.8.2.2428

Abstract

Noise contamination in electrocardiogram (ECG) monitoring systems can lead to errors in analysis and diagnosis, resulting in a high false alarm rate (FAR). Various studies have been conducted to reduce or eliminate noise in ECG signals. However, some noise characteristics overlap with the frequency range of the ECG signal, occur randomly, and are transient, resulting in shape alteration and amplitude reduction in the P and R waves. To address these challenges, the authors propose a framework for eliminating noise in ECG signals using the stationary wavelet transform and interval-dependent thresholds (IDT) based on change point detection. The proposed framework decomposes the input ECG signal at a specific level using the stationary wavelet transform, producing detail and approximation coefficients. Interval detection focuses on the first detail coefficient, d1, chosen for its significant content of noise coefficients, especially high-frequency noise. Threshold values are then computed for each interval, and hard and soft thresholding are applied individually to each interval. Finally, the signal is reconstructed by applying the inverse stationary wavelet transform to the thresholded coefficients. Two metrics, root mean square error (RMSE) and percentage root mean squared difference (PRD), were used to measure the performance of the proposed framework, which was also compared to the plain stationary wavelet transform (SWT) and discrete wavelet transform (DWT). The test results showed that the proposed method outperforms both: it obtained average improvements in RMSE scores of 18% and 45% over the SWT and DWT methods, respectively, and in PRD values of 17% and 37%, respectively. Thus, using IDT within the stationary wavelet transform can improve denoising performance. With this new framework for denoising ECG signals, we hope to offer an alternative method for other researchers to utilize when denoising ECG signals.
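The interval-dependent thresholding idea can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it uses a single-level undecimated Haar transform as a stand-in for the paper's SWT, fixed equal-length intervals instead of change-point detection, soft thresholding only, and the universal threshold estimated per interval.

```python
import numpy as np

def swt_haar(x):
    """Single-level undecimated (stationary) Haar transform.
    Perfect reconstruction holds: x[n] = a[n] + d[n]."""
    shifted = np.roll(x, -1)
    a = (x + shifted) / 2.0   # approximation coefficients
    d = (x - shifted) / 2.0   # detail coefficients
    return a, d

def interval_thresholds(d, n_intervals):
    """Universal threshold per interval, with a robust per-interval noise estimate."""
    thresholds = []
    for seg in np.array_split(d, n_intervals):
        sigma = np.median(np.abs(seg)) / 0.6745   # robust sigma from median absolute value
        thresholds.append(sigma * np.sqrt(2 * np.log(len(d))))
    return thresholds

def denoise(x, n_intervals=8):
    a, d = swt_haar(x)
    d_thr = np.empty_like(d)
    intervals = np.array_split(np.arange(len(d)), n_intervals)
    for idx, thr in zip(intervals, interval_thresholds(d, n_intervals)):
        seg = d[idx]
        # soft thresholding: shrink detail coefficients toward zero
        d_thr[idx] = np.sign(seg) * np.maximum(np.abs(seg) - thr, 0.0)
    # inverse single-level Haar SWT
    return a + d_thr
```

On a noisy sine wave this already lowers the RMSE against the clean signal; the paper's change-point-driven intervals target exactly the transient, localized noise this fixed split cannot adapt to.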
Artificial Neural Network Accuracy Optimization Using Transfer Function Methods on Various Human Gait Walking Environments Indrawati, Ragil Tri; Putri, Farika Tono; Safriana, Eni; Isti Nugroho, Wahyu; Prawibowo, Hartanto; Ariyanto, Mochammad
JOIV : International Journal on Informatics Visualization Vol 8, No 2 (2024)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.8.2.2159

Abstract

A bionic leg with ergonomic functionality can increase the user's independence, but such a leg is challenging to develop. One challenge concerns functionality: the bionic leg's motor can be difficult to adapt to the user. One solution is to drive the motor with the user's muscle signals, using the electromyography (EMG) signal as the source and feeding it back into the motor device. EMG signals obtained in a natural walking environment can yield smooth and natural movement. This study classifies EMG signals into 8 classes spanning a controlled walking environment (treadmill walking at various speeds) and a natural walking environment (ground walking, upstairs walking, and downstairs walking). The research aims to optimize the ANN method by varying its transfer functions; the best configuration will be used to train EMG-driven motors in future studies related to bionic legs. The best ANN configuration in this research uses Levenberg-Marquardt backpropagation as the training algorithm with a transfer function pairing of the hyperbolic tangent sigmoid and softmax functions, achieving 98.8% accuracy and an MSE of 0.036. The best method from the experiment and ANN classification can serve as a training method for a bionic leg in future research.
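The winning transfer-function pairing can be illustrated with a plain numpy forward pass. This is a sketch only: the layer sizes and weights below are hypothetical, and the Levenberg-Marquardt training step is not shown.

```python
import numpy as np

def tansig(x):
    """Hyperbolic tangent sigmoid transfer function (hidden layer)."""
    return np.tanh(x)

def softmax(x):
    """Softmax transfer function: turns output-layer scores into class probabilities."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))  # subtract row max for stability
    return e / e.sum(axis=-1, keepdims=True)

def forward(emg_features, W1, b1, W2, b2):
    """One hidden layer with tansig, softmax output over the 8 gait classes."""
    hidden = tansig(emg_features @ W1 + b1)
    return softmax(hidden @ W2 + b2)
```

Each output row sums to one, so the predicted gait class is simply the argmax of the corresponding row.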
Firefly Algorithm for SVM Multi-class Optimization on Soybean Land Suitability Analysis Nurkholis, Andi; Styawati, Styawati; Suhartanto, Alvi
JOIV : International Journal on Informatics Visualization Vol 8, No 2 (2024)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.8.2.1860

Abstract

Soybean is a primary source of vegetable protein, also containing fat and vitamins, and is widely consumed by Indonesian people. Soybean production in Indonesia declines every year because the area under soybean cultivation is shrinking, increasing dependence on imports from other countries. Land suitability maps can indicate priority locations for soybean cultivation based on land characteristics and weather, supporting optimal production. The multi-class SVM algorithm has been applied to classify land suitability data for such maps but has yet to achieve optimal accuracy, especially with the sigmoid kernel. The objective of this study is to enhance the performance of the sigmoid-kernel SVM using the firefly algorithm, evaluating the suitability of soybean cultivation in Bogor and Grobogan Regencies. The test results indicate that the firefly algorithm-optimized SVM (FA-SVM) significantly improves accuracy compared to the unoptimized SVM: FA-SVM achieves 89.95% accuracy, while the unoptimized SVM achieves only 65.99%. The best parameters produced by the firefly algorithm are C = 2.33 and σ = 0.45, obtained from firefly customization over 10 generations. The optimization algorithm can therefore be used to produce an optimal model, and the resulting model can guide farming communities toward priority locations for soybean cultivation so as to maximize soybean productivity.
A Prototype of Decentralized Applications (DApps) Population Management System Based on Blockchain and Smart Contract Saian, Septovan Dwi Suputra; Sembiring, Irwan; Manongga, Daniel H. F.
JOIV : International Journal on Informatics Visualization Vol 8, No 2 (2024)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.8.2.1861

Abstract

The Indonesian population reached 270.20 million in 2020, and each resident holds various items of confidential identity data. The COVID-19 pandemic pushed nearly all activities onto technology, so residents' identities are now stored digitally, and some applications that keep these identities have experienced data leaks. With the advent of Web3 and its emphasis on decentralization through blockchain, however, a new era of secure data management is possible. Blockchain, with its inherent security features, ensures that stored data is secure and difficult to damage or lose, thanks to mutual consensus; every transaction is recorded, making the audit process straightforward. This research therefore designs and implements a prototype dApps for secure population management, leveraging the security of blockchain technology. The initial stage of the research is a literature study, followed by the creation of system, infrastructure, and activity diagram designs; development of the dApps prototype; and finally testing with OWASP ZAP and a cost analysis. A dApps prototype was implemented on a blockchain, with every transaction recorded and publicly viewable through the Etherscan platform. Other data stored on the blockchain go through an AES-256 encryption process keyed to the data owner's account, so that only the owner can see the data. The tests performed show no high-level warnings. The cost analysis shows that the largest costs are incurred when deploying smart contracts and creating new data. Further development includes implementing a permissionless blockchain and multi-account support.
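The tamper-evidence and auditability properties the abstract relies on can be illustrated with a minimal hash chain. This sketch is not the authors' system: the real prototype runs on Ethereum smart contracts with AES-256 encryption, none of which appears here, and field names such as `nik` are hypothetical.

```python
import hashlib
import json

def block_hash(body):
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(body, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, record):
    """Append a population record; each block commits to its predecessor's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "record": record, "prev_hash": prev}
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)
    return chain

def audit(chain):
    """Verify that no stored record or link has been tampered with."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False                      # record was altered after storage
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False                      # chain link is broken
    return True
```

Changing any stored record invalidates its block hash and every later link, which is the property that makes the audit process in the prototype straightforward.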
The E-govqual and Importance Performance Analysis (IPA) Models Analysis: Review a Web Service Quality of E-government Yuhefizar, Y.; Utami, Devi; Sudiman, Josephine
JOIV : International Journal on Informatics Visualization Vol 8, No 2 (2024)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.8.2.1196

Abstract

The e-government web portal serves as a crucial platform for providing information that can be accessed easily and universally. It acts as an intermediary between the municipal administration and the community, ultimately resulting in improved and streamlined public services. Several variables, including age and proficiency in using the integrated system, provide insight into how to gauge the satisfaction of the information system's current users. The e-govqual and Importance Performance Analysis (IPA) models are accurate indicators of user satisfaction. This article seeks to understand how users perceive the e-government web portal of the Tanah Datar municipality. It compares the e-govqual and IPA models to determine the most suitable method for assessing public perceptions and identifying priority attributes for improving service quality; the two approaches share the same objective but employ different methodologies. Using a quantitative approach, the user's perception of performance is designated as the independent variable (X), while service quality expectations are designated as the dependent variable (Y), combining a Likert scale with five dimensions. The study gathers questionnaire data from 275 participants and applies both models, e-govqual and Importance Performance Analysis (IPA), to assess user satisfaction. The findings indicate that it is crucial for the government to respond quickly to user issues, provide feedback on user input, and regularly update the material on the web portal.
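The IPA step can be sketched as a quadrant classification of attributes. This is a generic illustration, assuming the common convention of using the grand means of importance and performance as the quadrant crosshairs; the paper may position the crosshairs differently.

```python
import numpy as np

def ipa_quadrants(importance, performance):
    """Classify attributes into the four IPA quadrants, using the grand means
    of importance and performance scores as the crosshairs."""
    imp = np.asarray(importance, float)
    perf = np.asarray(performance, float)
    imp_mean, perf_mean = imp.mean(), perf.mean()
    labels = []
    for i, p in zip(imp, perf):
        if i >= imp_mean and p < perf_mean:
            labels.append("concentrate here")        # important but underperforming
        elif i >= imp_mean and p >= perf_mean:
            labels.append("keep up the good work")
        elif i < imp_mean and p < perf_mean:
            labels.append("low priority")
        else:
            labels.append("possible overkill")       # performing well on a minor attribute
    return labels
```

Attributes in the "concentrate here" quadrant, such as responding quickly to user issues, are exactly the priority attributes the analysis surfaces.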
Rainfall-Runoff Modeling Using Artificial Neural Network for Batu Pahat River Basin Zulkiflee, Nurul Najihah; Mohd Safar, Noor Zuraidin; Kamaludin, Hazalila; Jofri, Muhamad Hanif; Kamarudin, Noraziahtulhidayu; Rasyidah, -
JOIV : International Journal on Informatics Visualization Vol 8, No 2 (2024)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.8.2.2704

Abstract

This research examines the effectiveness of Artificial Neural Networks with Multilayer Perceptron (ANN-MLP) and Nonlinear AutoRegressive with eXogenous inputs (NARX) models in predicting short-term rainfall-runoff patterns in the Batu Pahat River Basin. The study aims to predict river water levels at future intervals of 1, 3, and 6 hours using historical rainfall and river level data. Data preprocessing techniques, including the management of missing values, identification of outliers, and reduction of noise, were applied to enhance the accuracy and dependability of the models. The study assessed the ANN-MLP and NARX models by comparing their effectiveness across the forecast timeframes and evaluating their performance in different scenarios. The findings revealed that the ANN-MLP model performs robustly in short-term prediction; in contrast, the NARX model exhibited higher accuracy, particularly in capturing intricate temporal relationships and external impacts on river behavior. The ANN-MLP produces 99% accuracy for 1-hour prediction, and NARX yields 98% accuracy with a Root Mean Squared Error of 0.3245 and a Mean Absolute Error of 0.1967. This study contributes to hydrological forecasting by presenting a rigorous and precise modeling methodology.
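The NARX-style setup, which predicts a future river level from lagged level readings plus lagged exogenous rainfall, can be sketched with a design matrix of lags. As a hedged simplification, a least-squares fit stands in below for the paper's MLP and NARX networks; lag count and horizon are illustrative.

```python
import numpy as np

def make_lagged(rain, level, n_lags, horizon):
    """Stack the last n_lags rainfall (exogenous) and river-level readings
    as features for predicting the level `horizon` steps ahead."""
    X, y = [], []
    for t in range(n_lags, len(level) - horizon):
        X.append(np.concatenate([rain[t - n_lags:t], level[t - n_lags:t]]))
        y.append(level[t + horizon])
    return np.array(X), np.array(y)

def fit_linear(X, y):
    """Least-squares fit: a simple linear stand-in for the MLP/NARX models."""
    Xb = np.hstack([X, np.ones((len(X), 1))])     # append a bias column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return Xb @ w
```

The same lagged matrix feeds either model; the NARX network replaces the linear map with a recurrent nonlinear one, which is what lets it capture the intricate temporal relationships the study reports.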
Classification of Human Concentration Levels Based on Electroencephalography Signals Siregar, Baihaqi; Florence, Grace; Seniman, Seniman; Fahmi, Fahmi; Mubarakah, Naemah
JOIV : International Journal on Informatics Visualization Vol 8, No 2 (2024)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.8.2.2045

Abstract

Concentration denotes the capability to direct one's attention to a specific subject matter. In today's era of overwhelming information, distractions frequently impede human concentration, thereby influencing the depth of knowledge acquisition. Various factors contribute to declines in concentration, including diminished metabolic states, inadequate sleep, and engaging in multiple tasks simultaneously. An individual's cognitive state during thinking can be assessed through the analysis of electroencephalography (EEG) signals. The primary objective of this investigation is to help experts interpret EEG signal outcomes when categorizing concentration levels. The dataset used comprises unprocessed EEG data recorded from individuals in both relaxation and concentration states. After data preprocessing, feature extraction is executed, and classification is performed using the Support Vector Machine technique, yielding an accuracy of 84%. These developments allow for continual monitoring of brain function, an enhanced comprehension of cerebral activity, and increased operational efficacy of end-effectors. Their implications for prospective research are evident in the potential for more accurate diagnosis of neurological disorders and in sophisticated BCI applications designed to support healthcare and monitor cognitive states. The evolution of EEG technology is paving the way for novel research pathways in neuroscience and human-computer interaction.
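A common feature-extraction step for this kind of relaxation-vs-concentration task is spectral band power. The abstract does not specify the paper's feature set, so the sketch below assumes the standard alpha (8-13 Hz) and beta (13-30 Hz) bands, whose ratio is a widely used concentration indicator.

```python
import numpy as np

def band_power(signal, fs, band):
    """Mean spectral power of `signal` inside a frequency band (Hz), via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return power[mask].mean()

def eeg_features(signal, fs):
    """Alpha and beta band powers plus their ratio: beta tends to rise and
    alpha to fall during focused concentration."""
    alpha = band_power(signal, fs, (8.0, 13.0))
    beta = band_power(signal, fs, (13.0, 30.0))
    return np.array([alpha, beta, beta / alpha])
```

Feature vectors of this form, one per EEG epoch, are then what an SVM separates into the relaxation and concentration classes.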
A Review of Fog Computing Concept, Architecture, Application, Parameters and Challenges Ghani, Naveed; Bakar Sajak, Aznida Abu; Qureshi, Rehan; Azril Zuhairi, Megat Farez; Ahmad Baidowi, Zaid Mujaiyid Putra
JOIV : International Journal on Informatics Visualization Vol 8, No 2 (2024)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.8.2.2187

Abstract

The Internet of Things (IoT) has become an integral part of our daily lives, growing exponentially from a convenience into a necessity. IoT has been utilized extensively through cloud computing and has proven to be an excellent technology for deployment in various fields. The data generated by IoT devices is transmitted to the cloud for processing and storage. However, this approach raises specific issues such as latency, energy, availability of computation resources, bandwidth, heterogeneity, storage, and network failure. To overcome these obstacles, fog computing is employed as a middle tier: it gathers and processes the generated data closer to the user before transmitting it to the cloud. This paper conducts a structured review of the current state of fog computing and the architectures deployed across multiple industries. It also examines the implementation and the critical parameters for introducing fog computing into an IoT-cloud architecture. A detailed comparative analysis of five different architectures, considering various crucial parameters, identifies how quality of service and quality of experience for end users can be optimized. Finally, the paper surveys the challenges fog computing faces in a structured six-level approach; these challenges also point the way for future research in resource management, green computing, and security.
Features, Analysis Techniques, and Detection Methods of Cryptojacking Malware: A Survey Kadhum, Laith M; Firdaus, Ahmad; Hisham, Syifak Izhar; Mushtaq, Waheed; Ab Razak, Mohd Faizal
JOIV : International Journal on Informatics Visualization Vol 8, No 2 (2024)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.8.2.2725

Abstract

Various types of malware are capable of harming users, including root exploits, botnets, trojans, spyware, worms, viruses, ransomware, and cryptojacking. Cryptojacking accounts for a significant proportion of cyberattacks, in which exploiters mine cryptocurrencies using the victim's devices: smartphones, tablets, servers, or computers. It is also defined as the illegal utilization of victim resources (CPU, RAM, and GPU) to mine cryptocurrencies without detection. The purpose of cryptojacking, as with numerous other forms of cybercrime, is monetary gain, while staying concealed from the victim's viewpoint. To the authors' knowledge, a paper focusing solely on a review of cryptojacking research is still unavailable. This paper addresses that gap by presenting cryptojacking detection information, including methods, detection, analysis techniques, and features. Because cryptojacking malware executes its activities over the network, most of the analysis and features fall under dynamic activity; however, static analysis is also among the security researcher's options. The code representations involved are opcodes and JavaScript, demonstrating that these two are vital to focus on when detecting cryptojacking. Moreover, researchers have also begun to adopt deep learning in their experiments to detect cryptojacking malware. This paper also examines potential future developments in the detection of cryptojacking.
A Deep Learning-based Fault Detection and Classification in Smart Electrical Power Transmission System Khaleefah, Shihab Hamad; A. Mostafa, Salama; Gunasekaran, Saraswathy Shamini; Khattak, Umar Farooq; Yaacob, Siti Salwani; Alanda, Alde
JOIV : International Journal on Informatics Visualization Vol 8, No 2 (2024)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.8.2.2701

Abstract

Energy demands, and the responsibility to control them, have expanded dramatically. Various solutions have been introduced in response, including building high-capacity electrical generating power plants and applying the grid concept to synchronize power plants across geographically scattered grids. Electrical Power Transmission Networks (EPTN) are made of many complex, dynamic, and interrelated components. Transmission lines are essential components of the EPTN, and their fundamental duty is to transport electricity from the source area to the distribution network. These components, among others, are continually prone to electrical disturbance or failure. Hence, the EPTN requires fault detection and activation of protective mechanisms in the shortest time possible to preserve stability. This research focuses on using a deep learning approach for early fault detection to improve the stability of the EPTN. Early fault detection swiftly identifies and isolates faults, preventing cascading failures and enabling rapid corrective actions, which ensures the resilience and reliability of the grid and optimizes its operation even in the face of disruptions. The deep learning approach comprises a long short-term memory (LSTM) model trained on an electrical fault detection dataset in which three-phase currents and voltages at one end serve as inputs and fault detection as the output. The proposed LSTM model attains an accuracy of 99.65 percent with an error rate of just 1.17 percent and outperforms neural network (NN) and convolutional neural network (CNN) models.
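The gating mechanics that let an LSTM track a sequence of three-phase measurements can be sketched as a single forward step in numpy. This is an illustration of the standard LSTM cell only, not the paper's trained model; the hidden size and weight values below are hypothetical, and the six inputs stand for the three phase currents and three voltages.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. W (4n x m), U (4n x n), and b (4n) stack the
    input, forget, output, and candidate-gate parameters."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b            # all four gate pre-activations at once
    i = sigmoid(z[0:n])                   # input gate: admit new information
    f = sigmoid(z[n:2 * n])               # forget gate: retain old cell state
    o = sigmoid(z[2 * n:3 * n])           # output gate: expose the cell state
    g = np.tanh(z[3 * n:4 * n])           # candidate cell state
    c = f * c_prev + i * g                # new cell state
    h = o * np.tanh(c)                    # new hidden state
    return h, c
```

Iterating this step over a window of measurements yields a hidden state that a final classification layer maps to fault / no-fault, which is the role the LSTM plays in the proposed detector.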