Location
Kota Yogyakarta, Daerah Istimewa Yogyakarta, Indonesia
International Journal of Informatics and Communication Technology (IJ-ICT)
ISSN: 2252-8776     EISSN: 2722-2616     DOI: -
Core Subject: Science
International Journal of Informatics and Communication Technology (IJ-ICT) is a common platform for publishing quality research papers as well as other intellectual outputs. The journal is published by the Institute of Advanced Engineering and Science (IAES), whose aim is to promote the dissemination of scientific knowledge and technology in the Information and Communication Technology areas before an international audience of the scientific community, to encourage the progress and innovation of technology for human life, and to be a platform for the proliferation of ideas and thought for all scientists, regardless of their locations or nationalities. The journal covers all areas of Informatics and Communication Technology (ICT), focusing on integrating hardware and software solutions for the storage, retrieval, sharing, manipulation, management, analysis, visualization, and interpretation of data, and their applications for human services programs and practices, publishing refereed original research articles and technical notes. It is designed to serve researchers, developers, managers, strategic planners, graduate students, and others interested in state-of-the-art research activities in ICT.
Articles: 462 documents
Indonesian automated short-answer grading using transformers-based semantic similarity Situmeang, Samuel; Tambunan, Sarah Rosdiana; Ginting, Lidia; Simamora, Wahyu Krisdangolyanti; ButarButar, Winda Sari
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 14, No 3: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v14i3.pp1034-1043

Abstract

Automatic short answer grading (ASAG) systems offer a promising solution for improving the efficiency of reading literacy assessments. While promising, current Indonesian artificial intelligence (AI) grading systems still have room for improvement, especially when dealing with different domains. This study explores the effectiveness of large language models, specifically bidirectional encoder representations from transformers (BERT) variants, in conjunction with traditional hand-engineered features, to improve ASAG accuracy. We conducted experiments using various BERT models, hand-engineered features, text pre-processing techniques, and dimensionality reduction. Our findings show that BERT models consistently outperform traditional methods like term frequency-inverse document frequency (TF-IDF). IndoBERTLite-Base-P2 achieved the highest quadratic weighted kappa (QWK) score among the BERT variants. Integrating hand-engineered features with BERT resulted in a substantial enhancement of the QWK score. Comprehensive text pre-processing is a critical factor in achieving optimal performance. In addition, dimensionality reduction should be used carefully because it can remove semantic information.
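As a rough sketch of the semantic-similarity idea such graders build on (not the paper's actual pipeline), the Python snippet below scores a student answer against a reference answer by cosine similarity of transformer sentence embeddings; the model name, example sentences, and grade cut-offs are illustrative assumptions only.

    # Hypothetical sketch: grade a short answer by embedding similarity.
    # Model name and thresholds are assumptions, not the paper's setup.
    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("distiluse-base-multilingual-cased-v2")

    reference = "Fotosintesis mengubah cahaya matahari menjadi energi kimia."
    answer = "Tumbuhan memakai sinar matahari untuk membuat energi."

    emb = model.encode([reference, answer], convert_to_tensor=True)
    similarity = util.cos_sim(emb[0], emb[1]).item()  # cosine similarity in [-1, 1]

    # Map similarity to a coarse grade band (illustrative cut-offs only).
    grade = 2 if similarity > 0.8 else 1 if similarity > 0.5 else 0
    print(f"similarity={similarity:.3f}, grade={grade}")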
Leveraging IoT with LoRa and AI for predictive healthcare analytics Lavanya, Pillalamarri; Venkatachalam, Selvakumar; Subba Reddy, Immareddy Venkata
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 14, No 3: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v14i3.pp1156-1162

Abstract

Progress in mobile technology, the internet, cloud computing, digital platforms, and social media has substantially facilitated interpersonal connections following the COVID-19 pandemic. As individuals increasingly prioritise health, there is an escalating desire for novel methods to assess health and well-being. This study presents an internet of things (IoT)-based system for remote monitoring utilizing a long range (LoRa), a low-cost and LoRa wireless network for the early identification of health issues in home healthcare environments. The project has three primary components: transmitter, receiver, and alarm systems. The transmission segment captures data via sensors and transmits it to the reception segment, which then uploads it to the cloud. Additionally, machine learning (ML) methods, including convolutional neural networks (CNN), artificial neural networks (ANN), Naïve Bayes (NB), and long short-term memory (LSTM), were utilized on the acquired data to forecast heart rate, blood oxygen levels, body temperature patterns. The forecasting models are trained and evaluated using data from various health parameters from five diverse persons to ascertain the architecture that exhibits optimal performance in modeling and predicting dynamics of different medical parameters. The models' accuracy was assessed using mean absolute error (MAE) and root mean square error (RMSE) measures. Although the models performed similarly, the ANN model outperformed them in all conditions.
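A minimal sketch of the forecasting step, assuming a synthetic heart-rate series and a sliding-window ANN (scikit-learn's MLPRegressor stands in for the paper's models), scored with the MAE and RMSE metrics the abstract mentions:

    # Sketch only: synthetic data, assumed window size of 5 readings.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.metrics import mean_absolute_error, mean_squared_error

    rng = np.random.default_rng(0)
    hr = 70 + 5 * np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 1, 500)  # synthetic BPM

    window = 5  # predict the next reading from the previous five
    X = np.array([hr[i:i + window] for i in range(len(hr) - window)])
    y = hr[window:]
    split = int(0.8 * len(X))

    model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
    model.fit(X[:split], y[:split])
    pred = model.predict(X[split:])

    mae = mean_absolute_error(y[split:], pred)
    rmse = mean_squared_error(y[split:], pred) ** 0.5
    print(f"MAE={mae:.2f} BPM, RMSE={rmse:.2f} BPM")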
Quantifying the severity of cyber attack patterns using complex networks Hasan, Ahmed Salih; Mohammed, Yasir F.; Mahmood, Basim
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 14, No 3: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v14i3.pp1179-1188

Abstract

This work quantifies the severity and likelihood of cyberattacks using complex network modeling. A dataset from common attack pattern enumerations and classifications (CAPEC) is collected and formalized as nodes and edges to create a network model. In this model, each attack pattern is represented as a node, and an edge is created between two nodes when there is a relation between them. The dataset includes 559 attack patterns and 1,921 relations among them. Network metrics are used to perform the analysis at the network level and the node level. Moreover, a ranking of the CAPECs from a complex-network perspective is generated. This ranking is compared with the CAPEC ranking system and discussed in depth from a cybersecurity perspective. The findings reveal notable insights into the likelihood and severity of attacks, suggesting that the CAPEC ranking system should take the network perspective into account. Finally, the results of this work can be of high interest to security architects.
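A toy illustration of the modeling idea (made-up pattern pairs, not the actual CAPEC data): attack patterns become nodes, documented relations become edges, and centrality metrics give a network-based ranking.

    # Illustrative sketch with networkx; the edge list is hypothetical.
    import networkx as nx

    relations = [
        ("CAPEC-66", "CAPEC-7"), ("CAPEC-66", "CAPEC-108"),
        ("CAPEC-7", "CAPEC-108"), ("CAPEC-108", "CAPEC-470"),
        ("CAPEC-470", "CAPEC-248"),
    ]
    G = nx.Graph(relations)  # one node per attack pattern, one edge per relation

    # Rank patterns by betweenness centrality as a network-level proxy.
    centrality = nx.betweenness_centrality(G)
    ranking = sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)
    for pattern, score in ranking:
        print(f"{pattern}: betweenness={score:.3f}, degree={G.degree(pattern)}")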
Unit commitment problem solved with adaptive particle swarm optimization Muthu, Ramesh Babu; Chandrasekaran, Venkatesh Kumar; Munusamy, Bharathraj; Sankaranarayanan, Dashagireevan
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 14, No 3: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v14i3.pp783-790

Abstract

This article presents an innovative approach to the generation-scheduling problem that determines the operating states of all generating units over a given daily time schedule. The scheduling variables are set up to encode each day's load demand as an integer. The proposed adaptive particle swarm optimization (APSO) technique solves the scheduling problem by optimizing both production and transition costs. System and generator constraints are considered when solving the problem, including minimum and maximum up and down times and the output limits of each generating unit (such as capacity reserves). The proposed algorithm can also be applied to unit commitment problems with wind and thermal units. Test systems with 26 and 10 units are used to validate the proposed algorithm.
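A toy sketch of the underlying PSO mechanics, assuming made-up unit capacities, costs, and demand, with a linearly decaying inertia weight standing in for the paper's adaptation scheme:

    # Binary commitment of 4 units for one hour; all numbers are assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    capacity = np.array([200.0, 150.0, 100.0, 50.0])   # MW per unit (assumed)
    cost = np.array([8.0, 10.0, 12.0, 15.0])           # $/MWh per unit (assumed)
    demand = 300.0                                     # MW

    def fitness(bits):
        supplied = (bits * capacity).sum()
        shortfall = max(0.0, demand - supplied)
        return (bits * capacity * cost).sum() + 1e4 * shortfall  # penalize unmet load

    n_particles, n_units, iters = 20, 4, 60
    pos = rng.random((n_particles, n_units))           # continuous positions in [0, 1]
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p > 0.5) for p in pos])
    gbest = pbest[pbest_fit.argmin()].copy()

    for t in range(iters):
        w = 0.9 - 0.5 * t / iters                      # decaying inertia weight
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        fit = np.array([fitness(p > 0.5) for p in pos])
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[pbest_fit.argmin()].copy()

    print("commitment:", (gbest > 0.5).astype(int), "cost:", fitness(gbest > 0.5))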
Shellcode classification analysis with binary classification-based machine learning Semendawai, Jaka Naufal; Stiawan, Deris; Anto Saputra, Iwan Pahendra; Shenify, Mohamed; Budiarto, Rahmat
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 14, No 3: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v14i3.pp923-932

Abstract

The internet enables people to connect through their devices. While it offers numerous benefits, it also has adverse effects. A prime example is malware, which can damage or even destroy a device or harm its users, highlighting the importance of cybersecurity. Various methods can be employed to prevent or detect malware, including machine learning techniques. The experiments are based on training and testing data from the UNSW_NB15 dataset. K-nearest neighbor (KNN), decision tree, and Naïve Bayes classifiers are used to determine whether a record in the test data represents a Shellcode attack or a non-Shellcode attack. The KNN, decision tree, and Naïve Bayes classifiers reached accuracy rates of 96.26%, 97.19%, and 57.57%, respectively. This study's findings offer valuable insights into the application of machine learning to detect or classify malware and other forms of cyberattack.
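The comparison can be sketched as follows; synthetic data stands in for the UNSW_NB15 features, so the accuracies will not match the paper's figures:

    # Sketch of the three-classifier comparison on stand-in data.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.metrics import accuracy_score

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)),
                      ("Decision tree", DecisionTreeClassifier(random_state=0)),
                      ("Naive Bayes", GaussianNB())]:
        clf.fit(X_tr, y_tr)
        print(f"{name}: accuracy={accuracy_score(y_te, clf.predict(X_te)):.4f}")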
Revolutionizing human activity recognition with Prophet algorithm and deep learning Dhage, Jaykumar S.; Gulve, Avinash K.
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 14, No 3: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v14i3.pp1108-1118

Abstract

Various industries, such as healthcare and surveillance, depend heavily on the ability to recognize human activity. The “human activity recognition (HAR) using smartphones data set,” available in the UCI online repository, includes accelerometer and gyroscope readings recorded during a variety of human activities. The accelerometer and gyroscope signals are subjected to a band-pass filter to eliminate unwanted frequencies and background noise. This method effectively decreases the dimensionality of the feature space while improving the model's accuracy and efficiency. Convolutional neural networks (CNNs) and long short-term memory (LSTM) networks are combined to create the pyramidal dilated convolutional memory network (PDCMN), which is the final proposal. Experimental results demonstrate the effectiveness and reliability of the suggested method, indicating its potential for precise and efficient HAR in real-world settings.
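As a minimal sketch of a CNN+LSTM hybrid on windowed inertial signals (shapes, layer sizes, and dummy data are illustrative; the paper's PDCMN adds pyramidal dilated convolutions not reproduced here):

    # Smoke-test sketch on random data; not the paper's architecture.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models

    n_windows, timesteps, channels, n_classes = 256, 128, 6, 6  # accel + gyro axes
    X = np.random.randn(n_windows, timesteps, channels).astype("float32")
    y = np.random.randint(0, n_classes, n_windows)

    model = models.Sequential([
        layers.Input(shape=(timesteps, channels)),
        layers.Conv1D(64, 5, dilation_rate=2, activation="relu", padding="same"),
        layers.MaxPooling1D(2),
        layers.LSTM(64),                      # temporal memory over conv features
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(X, y, epochs=2, batch_size=32, verbose=0)  # dummy data only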
Comparative analysis of U-Net architectures and variants for hand gesture segmentation in Parkinson’s patients Telepatil, Avadhoot Ramgonda; Vaddin, Jayashree Sathyanarayana
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 14, No 3: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v14i3.pp972-982

Abstract

U-Net is a well-known method for image segmentation and has proven effective for a variety of segmentation challenges. A deep learning architecture for segmenting hand gestures in Parkinson’s disease is explored in this paper. We prepared and compared four custom models: a simple U-Net, a three-layer U-Net, an autoencoder-decoder architecture, and a U-Net with dense skip pathways, using a custom dataset of 1,000 hand gesture images and their corresponding masks. Our primary goal was to achieve accurate segmentation of parkinsonian hand gestures, which is crucial for automated diagnosis and monitoring in healthcare. Using metrics including accuracy, precision, recall, intersection over union (IoU), and Dice score, we demonstrated that our architectures were effective in delineating hand gestures under different conditions. We also compared the performance of our custom models against pretrained deep learning architectures such as ResNet and VGGNet. Our findings indicate that the custom models effectively address the segmentation task, showcasing promising potential for practical applications in medical diagnostics and healthcare. This work highlights the versatility of our architectures in tackling the unique segmentation challenges associated with Parkinson’s disease research and clinical practice.
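The IoU and Dice metrics used in the comparison can be computed as below for binary masks; the toy masks are placeholders:

    # Standard IoU and Dice definitions on binary masks.
    import numpy as np

    def iou(pred, truth):
        pred, truth = pred.astype(bool), truth.astype(bool)
        inter = np.logical_and(pred, truth).sum()
        union = np.logical_or(pred, truth).sum()
        return inter / union if union else 1.0

    def dice(pred, truth):
        pred, truth = pred.astype(bool), truth.astype(bool)
        inter = np.logical_and(pred, truth).sum()
        total = pred.sum() + truth.sum()
        return 2 * inter / total if total else 1.0

    pred = np.zeros((64, 64), dtype=np.uint8); pred[10:40, 10:40] = 1
    truth = np.zeros((64, 64), dtype=np.uint8); truth[15:45, 15:45] = 1
    print(f"IoU={iou(pred, truth):.3f}, Dice={dice(pred, truth):.3f}")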
Electric load forecasting using ARIMA model for time series data Belshanth, Balasubramanian; Prasad, Haran; Sudhakar, Thirumalaivasal Devanathan
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 14, No 3: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v14i3.pp830-836

Abstract

Any country's economic progress is heavily reliant on its power infrastructure, network, and availability, as energy has become an essential component of daily living in today's world. Electricity's distinctive quality is that it cannot be stored in huge quantities, which explains why global demand for home and commercial electricity has grown at an astonishing rate. On the other hand, electricity costs have varied in recent years, and there is insufficient electricity output to meet global and local demand. The solution is a series of case studies designed to forecast future residential and commercial electricity demand so that power producers, transformers, distributors, and suppliers may efficiently plan and encourage energy savings for consumers. However, load forecasting has been one of the most difficult issues confronting the energy business since the inception of electricity. This study covers a new one-dimensional algorithm that is essential for the creation of a short-term load forecasting module for distribution system design and operation, which involves numerous operations including energy purchase, generation, and infrastructure construction. Among the numerous time series forecasting methods available, the autoregressive integrated moving average (ARIMA) model outperforms the others for load forecasting.
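A minimal ARIMA sketch on a synthetic hourly load series; the (p, d, q) = (2, 1, 2) order is an illustrative choice, not the paper's fitted model:

    # Fit ARIMA on 29 days of synthetic hourly load, forecast the last day.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    t = np.arange(24 * 30)  # 30 days of hourly loads
    load = 500 + 100 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 10, t.size)

    train, test = load[:-24], load[-24:]
    model = ARIMA(train, order=(2, 1, 2)).fit()
    forecast = model.forecast(steps=24)  # next-day hourly forecast

    mape = np.mean(np.abs((test - forecast) / test)) * 100
    print(f"24-hour MAPE: {mape:.2f}%")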
Parameter-optimized routing protocols for targeted broadcast messages in smart campus environments Al-Sofy, Karam Mheide; Jalal, Jalal Khalid; Fadhil, Fajer F.; Mahmood, Basim
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 14, No 3: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v14i3.pp1056-1071

Abstract

The spread of handheld mobile devices integrated with multiple sensors makes it easy for these devices to interact with each other. These interactions are useful in a variety of applications, such as monitoring and notification systems, that can be adopted in smart campuses. The performance of these applications depends primarily on the network infrastructure and network protocols. In cases of failure, a smart campus requires effective alternatives that can handle essential services. Hence, this work uses a Wi-Fi mobile ad hoc network (MANET) as a backup to the traditional infrastructure. The dynamic nature of such a network relies on individuals' mobility, which leads to a lack of end-to-end connectivity. To overcome this challenge, delay-tolerant networking (DTN) has been adopted as the primary approach to routing information inside the campus. The spray and wait, binary spray and wait (BSW), and probabilistic flooding protocols are assessed in depth to ensure sustained communications in the working environment. The protocols' parameters are comprehensively investigated and optimized. The performance metrics used in the evaluation are message consumption, node responsiveness, and coverage. The findings showed that the optimal protocol and its parameters depend on the specific application and the available resources.
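The copy-splitting rule at the heart of binary spray and wait can be sketched as follows (the contact trace and copy budget are made up):

    # Toy BSW: a node holding n > 1 copies hands off floor(n/2) on contact
    # and keeps the rest; with one copy left it waits for the destination.
    def bsw_handoff(copies_held):
        """Return (kept, given) when a node with copies_held > 1 meets a relay."""
        given = copies_held // 2
        return copies_held - given, given

    copies = 8  # initial copies sprayed for one broadcast message (assumed)
    holders = {"A": copies}
    contacts = [("A", "B"), ("B", "C"), ("A", "D")]  # hypothetical encounters

    for src, dst in contacts:
        if holders.get(src, 0) > 1:
            kept, given = bsw_handoff(holders[src])
            holders[src], holders[dst] = kept, holders.get(dst, 0) + given
    print(holders)  # {'A': 2, 'B': 2, 'C': 2, 'D': 2}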
AI-based federated learning for heart disease prediction: a collaborative and privacy-preserving approach Bhatt, Stuti; Salkuti, Surender Reddy; Kim, Seong-Cheol
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 14, No 3: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v14i3.pp751-759

Abstract

People with conditions such as diabetes, high blood pressure, and high cholesterol are at an increased risk of heart disease and stroke as they get older. To mitigate this threat, predictive models leveraging machine learning (ML) and artificial intelligence (AI) have emerged as valuable tools; however, heart disease prediction is a complicated task, and diagnostic outcomes are rarely accurate. Existing ML approaches require data to be gathered in centralized locations for heart disease detection, where it is easily accessible. This review introduces federated learning (FL) to address data privacy challenges in heart disease prediction. FL, a collaborative technique pioneered by Google, trains algorithms across independent sessions using local datasets. This paper investigates recent ML methods and databases for predicting cardiovascular disease (heart attack). Previous research explores algorithms such as region-based convolutional neural networks (RCNN), convolutional neural networks (CNN), and federated logistic regressions (FLR) for heart and other disease prediction. FL allows the training of a collaborative model while keeping patient data distributed across multiple sites, ensuring privacy and security. This paper explores the efficacy of FL in enhancing the accuracy of cardiovascular disease (CVD) prediction models while preserving data privacy across distributed datasets.
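A conceptual FedAvg sketch in plain NumPy, assuming synthetic per-site data: each "hospital" trains a logistic-regression model locally, and only the weights are averaged centrally, so raw patient records never leave a site.

    # FedAvg sketch; data, sizes, and round counts are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)

    def local_train(X, y, w, lr=0.1, epochs=50):
        for _ in range(epochs):  # plain gradient descent on logistic loss
            p = 1 / (1 + np.exp(-X @ w))
            w = w - lr * X.T @ (p - y) / len(y)
        return w

    n_features, n_sites = 8, 3
    true_w = rng.normal(size=n_features)
    sites = []
    for _ in range(n_sites):  # synthetic per-site patient data
        X = rng.normal(size=(200, n_features))
        y = (X @ true_w + rng.normal(0, 0.5, 200) > 0).astype(float)
        sites.append((X, y))

    global_w = np.zeros(n_features)
    for _ in range(5):  # FedAvg: broadcast, train locally, average weights
        local_ws = [local_train(X, y, global_w.copy()) for X, y in sites]
        global_w = np.mean(local_ws, axis=0)

    acc = np.mean([(((X @ global_w) > 0) == y).mean() for X, y in sites])
    print(f"mean on-site accuracy after 5 rounds: {acc:.3f}")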