Contact Name
Dahlan Abdullah
Contact Email
dahlan@unimal.ac.id
Phone
+62811672332
Journal Mail Official
ijestyjournal@gmail.com
Editorial Address
Jl. Tgk. Chik Ditiro, Lancang Garam, Lhokseumawe, Aceh - Indonesia, 24351
Location
Kota Lhokseumawe,
Aceh
INDONESIA
International Journal of Engineering, Science and Information Technology
ISSN: -     EISSN: 2775-2674     DOI: -
The journal covers all aspects of applied engineering, applied science, and information technology, namely:
Engineering: Energy; Mechanical Engineering; Computing and Artificial Intelligence; Applied Biosciences and Bioengineering; Environmental and Sustainable Science and Technology; Quantum Science and Technology; Applied Physics; Earth Sciences and Geography; Civil Engineering; Electrical, Electronics and Communications Engineering; Robotics and Automation; Marine Engineering; Aerospace Science and Engineering; Architecture; Chemical & Process Engineering; Structural, Geological & Mining Engineering; Industrial, Mechanical & Materials Engineering.
Science: Bioscience & Biotechnology; Chemistry; Food Technology; Applied Biosciences and Bioengineering; Environmental Health Science; Mathematics; Statistics; Applied Physics; Biology; Pharmaceutical Science.
Information Technology: Artificial Intelligence; Computer Science; Computer Networks; Data Mining; Web Programming Languages; E-Learning & Multimedia; Information Systems; Internet & Mobile Computing; Databases; Data Warehousing; Big Data; Machine Learning; Operating Systems; Algorithms; Computer Architecture; Computer Security; Embedded Systems; Cloud Computing; Internet of Things; Robotics; Computer Hardware; Geographical Information Systems; Virtual Reality & Augmented Reality; Multimedia; Computer Vision; Computer Graphics; Pattern & Speech Recognition; Image Processing; ICT interaction with society; ICT applications in social science; ICT as a social research tool; ICT in education.
Articles: 80 Documents
Search results for issue "Vol 5, No 1 (2025)": 80 documents
Multi-Hop Signal Transmission Patterns in Oracle APEX-Based Monitoring Systems with Dynamic IoT Feedback Loops
Keshireddy, Srikanth Reddy
International Journal of Engineering, Science and Information Technology Vol 5, No 1 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i1.1450

Abstract

This investigation maps how multi-hop signals travel through Oracle APEX monitoring architectures while absorbing real-time feedback from distributed IoT devices. It reproduces, in code, the relay of packets between nodes and characterises how delivery rates degrade as packet loss, node spacing, and retransmission limits shift with network load. The platform integrates SAP telemetry with Honeycomb and Oracle APEX dashboards and MQTT feedback loops, then feeds those streams into a simulation engine that produces curves for signal degradation, loop latency, and feedback turnaround time. The results show that tuning the retransmission count and monitoring hop length can shift reliability figures almost one-to-one, while edge-aware feedback loops reduced jitter-induced delays by a reported twenty-two percentage points. The work contributes a timing model for selecting the quickest feedback paths in cloud-based enterprise IoT deployments, offering operators a faster and more adaptable monitoring toolkit.
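As a rough illustration of the relay behaviour the abstract describes, the sketch below estimates end-to-end delivery rate as the per-hop retransmission limit varies. It is a minimal stand-in, not the paper's simulation engine; the function names and parameters (hop_delivers, p_loss, max_retx) are hypothetical, and the real model also accounts for node spacing and load.

import random

# Minimal sketch: each hop drops a packet with probability p_loss and may
# retransmit up to max_retx times; delivery requires every hop to succeed.
def hop_delivers(p_loss, max_retx):
    return any(random.random() > p_loss for _ in range(1 + max_retx))

def delivery_rate(hops, p_loss, max_retx, trials=20000):
    delivered = sum(
        all(hop_delivers(p_loss, max_retx) for _ in range(hops))
        for _ in range(trials)
    )
    return delivered / trials

if __name__ == "__main__":
    for retx in (0, 1, 2, 3):
        rate = delivery_rate(hops=5, p_loss=0.10, max_retx=retx)
        print(f"5 hops, 10% loss, max_retx={retx}: {rate:.3f}")

Raising max_retx from 0 to 2 lifts the five-hop delivery rate from roughly 0.59 to nearly 1.0, mirroring the near one-to-one sensitivity to retransmission tuning the abstract reports.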
Integrating Cloud Storage in STEM Education: A Case Study on Collaborative Project-Based Learning
Imomova, Umida; Tleuzhanova, Manatzhan; Sattorova, Zilola; Khaydarova, Mahliyo; Doniyarov, Mavlonbek; Nasritdinova, Umida; Saidov, Madilkhan
International Journal of Engineering, Science and Information Technology Vol 5, No 1 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i1.1360

Abstract

When science, technology, engineering, and mathematics (STEM) subjects are integrated, they equip children with the knowledge and skills they need to become capable, responsible adults. The primary teaching approach used in this research was Project-Oriented Problem-Based Learning (Po-PBL), and the study examined the effects of an integrated STEM education system on students' 21st-century competencies. A one-group quasi-experimental methodology and polling techniques were used to assess the students' understanding before and after the program. The findings demonstrated that pupils' overall 21st-century abilities significantly improved. This was particularly true for their production skills, which improved from mediocre to excellent. Because Po-PBL requires students to focus on real-life issues and discover answers, it is evidently particularly beneficial for students in STEM areas. The research emphasizes the value of incorporating Po-PBL into STEM education to help students improve their problem-solving, creativity, teamwork, and communication skills. When students work on hands-on projects, they apply what they already know and discover new things. These abilities will help students deal with challenging situations in the future.
Quantum AI-Enhanced Nanomagnetic Sensors for Biomedical Imaging
Biswas, Debarghya; Balkrishna, Sutar Manisha; Aggarwal, Rashi
International Journal of Engineering, Science and Information Technology Vol 5, No 1 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i1.1451

Abstract

Quantum AI-enhanced nanomagnetic sensors represent a high-impact advance in biomedical imaging, where the combination of quantum coherence and AI-driven nanoscale sensing provides a substantial increase in diagnostic precision. This research outlines the QAI-NMS system, which utilises quantum dots and nitrogen-vacancy (NV) centres in diamond to improve bio-magnetic sensing to sub-picotesla sensitivity. AI-driven quantum noise suppression and hybrid quantum-classical computing together augment signal clarity and reduce quantum decoherence. The system uses real-time signal optimisation based on deep reinforcement learning, as well as high-fidelity biomedical imaging via variational quantum algorithms. Conventional methods such as MRI and CT are more invasive, involve radiation, and offer limited portability and sensitivity, whereas QAI-NMS provides non-invasive, radiation-free, portable imaging with higher sensitivity. Potential applications include early cancer detection, neural activity mapping for brain-computer interfaces, non-invasive cardiac monitoring, and tracking drug delivery to a target area without interfering with the body. A quantitative analysis of signal-to-noise ratio, quantum-assisted resolution enhancement, and computational efficiency is provided, and experimental evaluations exhibit a significantly improved signal-to-noise ratio. This study constitutes a paradigm shift in biomedical imaging by merging quantum technologies with AI analytics to realise real-time, high-resolution, noise-immune imaging. The proposed framework would have broad application in next-generation diagnostic tools, offering high precision in health monitoring and medical imaging. Future research will focus on miniaturisation and system integration to enable real-time clinical deployment.
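To make the signal-to-noise claims concrete, here is a minimal sketch assuming a synthetic sinusoid as a stand-in bio-magnetic signal; a moving-average filter stands in for the paper's AI-driven quantum noise suppression, which is far more sophisticated.

import numpy as np

def snr_db(clean, noisy):
    # SNR in decibels: signal power over residual noise power.
    noise = noisy - clean
    return 10 * np.log10(np.sum(clean**2) / np.sum(noise**2))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
clean = np.sin(2 * np.pi * 5 * t)                  # stand-in bio-magnetic signal
noisy = clean + 0.5 * rng.standard_normal(t.size)  # additive sensor noise
kernel = np.ones(25) / 25                          # simple moving-average denoiser
denoised = np.convolve(noisy, kernel, mode="same")

print(f"raw SNR:      {snr_db(clean, noisy):6.2f} dB")
print(f"denoised SNR: {snr_db(clean, denoised):6.2f} dB")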
Propagation Faults in Real-Time Content Streaming Across Low-Bandwidth Learning Infrastructure
Sappa, Ankita
International Journal of Engineering, Science and Information Technology Vol 5, No 1 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i1.1447

Abstract

Delivering live educational video in low-bandwidth regions remains difficult because transmission faults appear as lost frames, repeated rebuffering, and compounding delays. In response, this paper builds a detailed simulation tool that tests how these failures unfold across different hardware layouts and network grades, focusing on bandwidth limits, heterogeneous devices, and edge caching. The experiments find that caching key content at the edge cuts average stalls by more than 40 per cent compared with a distant central server, a gap that widens under a 512 kbps cap. A three-dimensional model of delay spread further shows that pause rates and picture-quality degradation rise sharply, not linearly, as bandwidth jitter and viewer count climb. The work also identifies fault patterns tied to specific protocols and recommends tuning buffer sizes and retry timings to dampen these cascades. Taken together, the results offer concrete design guidance for deploying remote learning over weak infrastructure.
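A hedged sketch of the stall dynamics described above: the parameters (segment size, latency figures, buffer target) are illustrative assumptions rather than the paper's testbed, but the edge-cache advantage under a 512 kbps link shows up in the same qualitative way.

import random

def simulate_stalls(latency_s, bandwidth_kbps, segment_kbits=400,
                    segments=200, jitter=0.3, buffer_s=3.0):
    # Each segment holds 1 s of playback; the buffer drains while fetching.
    random.seed(42)
    buffered, stalls = buffer_s, 0
    for _ in range(segments):
        rate = bandwidth_kbps * random.uniform(1 - jitter, 1 + jitter)
        fetch_time = latency_s + segment_kbits / rate
        buffered = min(buffered + 1.0 - fetch_time, buffer_s)  # capped buffer
        if buffered < 0:
            stalls += 1
            buffered = buffer_s  # pause to rebuffer before resuming
    return stalls

print("central server:", simulate_stalls(latency_s=0.40, bandwidth_kbps=512))
print("edge cache:    ", simulate_stalls(latency_s=0.05, bandwidth_kbps=512))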
AI-driven Quantum Dot Transistors for Ultra-Low Power Computing
Mishra, Archana; Kadao, Anjali Krushna; Rohilla, Shruti
International Journal of Engineering, Science and Information Technology Vol 5, No 1 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i1.1350

Abstract

Quantum Dot Transistors (QDTs) offer a transformative approach to ultra-low power computing, yet their optimisation remains an open problem; this research therefore proposes an AI-based framework that dynamically improves QDT efficiency and adaptability using reinforcement learning and neuromorphic AI, with next-generation processor design as the example application context. The proposed intelligent tuning mechanism optimises charge transport, lowers leakage currents, and minimises energy dissipation according to the real-time workload. To precisely control and self-adjust transistor behaviour under varying environmental conditions, it integrates a hybrid quantum-classical AI model. The mechanism also adopts self-healing features that autonomously reconfigure transistor networks when anomalies are encountered, ensuring fault tolerance and extending device longevity. Simulations validate the proposed methodology, showing considerable improvements in power efficiency, switching speed, and operational stability over conventional low-power transistors. By leveraging AI-driven optimisation, this work harnesses the potential of QDTs for next-generation energy-efficient electronics such as the Internet of Everything (IoE), edge computing, and neuromorphic processors. The findings contribute a scalable, intelligent method for designing ultra-low power devices in the emerging field of AI-assisted semiconductor technology, enabling performance improvements while reducing digital systems' environmental footprint and laying a foundation for sustainable computing.
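As a toy illustration of reinforcement-learning-style tuning (not the paper's neuromorphic controller), the sketch below uses an epsilon-greedy bandit to pick the gate-bias setting that minimises a noisy energy-dissipation reading; all voltages and energy values are invented for the example.

import random

settings = [0.10, 0.15, 0.20, 0.25, 0.30]        # candidate gate biases (V), hypothetical
true_energy = {0.10: 9.0, 0.15: 6.5, 0.20: 5.0,  # hidden dissipation landscape (fJ), invented
               0.25: 5.8, 0.30: 8.2}
estimates = {s: 0.0 for s in settings}
counts = {s: 0 for s in settings}

random.seed(1)
for step in range(2000):
    if random.random() < 0.1:                    # explore a random setting
        s = random.choice(settings)
    else:                                        # exploit the lowest estimate;
        # unvisited settings (estimate 0.0) look attractive, forcing initial trials
        s = min(settings, key=lambda x: estimates[x] if counts[x] else 0.0)
    reading = true_energy[s] + random.gauss(0, 0.5)       # noisy measurement
    counts[s] += 1
    estimates[s] += (reading - estimates[s]) / counts[s]  # running-mean update

best = min(settings, key=lambda s: estimates[s])
print(f"selected bias {best:.2f} V, estimated dissipation {estimates[best]:.2f} fJ")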
Utilization of Machine Learning for Stunting Prediction: Case Study and Implications for Pre-Maternal and Pre-Conceptive Midwifery Services
Aini, Qurotul; Rahardja, Untung; Sutedja, Indrajani; Spits Warnar, Harco Leslie Hendric; Septiani, Nanda
International Journal of Engineering, Science and Information Technology Vol 5, No 1 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i1.1488

Abstract

Stunting, a global health challenge, affects millions of children, particularly in low- and middle-income countries, and has lasting consequences on cognitive development, physical growth, and overall well-being. Early prediction and intervention are crucial for reducing stunting, especially before conception and during early pregnancy. This paper explores the utilisation of machine learning (ML) for predicting stunting risk in the context of pre-maternal and pre-conceptive midwifery services. By analysing a case study, the research assesses the effectiveness of various machine learning algorithms in identifying stunting risk factors, including maternal health, nutrition, socioeconomic status, and environmental conditions. Using healthcare and demographic data, the study develops predictive models to assist midwives in assessing stunting risks during pre-conception and prenatal phases. The findings demonstrate that ML models, particularly random forest and support vector machine algorithms, outperform traditional risk assessment methods, providing higher accuracy and earlier detection of stunting risk. These models enable midwives to deliver personalised care and targeted interventions, optimising maternal and child health outcomes. The study also highlights the broader implications of integrating machine learning into midwifery services, including improved decision-making, resource allocation, and healthcare efficiency. In conclusion, this research underscores the transformative potential of machine learning in predicting stunting risk and enhancing the effectiveness of pre-maternal and pre-conceptive midwifery services, offering a promising approach to mitigating the global burden of stunting.
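A minimal sketch of the reported model family (random forest), assuming synthetic stand-in data; the feature set (maternal height, BMI, sanitation, haemoglobin) and the risk rule generating labels are hypothetical illustrations, not the study's dataset.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 1000
X = np.column_stack([
    rng.normal(155, 6, n),     # maternal height (cm)
    rng.normal(22, 3, n),      # pre-pregnancy BMI
    rng.integers(0, 2, n),     # improved sanitation (0/1)
    rng.normal(11.5, 1.5, n),  # haemoglobin (g/dL)
])
# Synthetic label: shorter, anaemic mothers without sanitation carry higher risk.
risk = (155 - X[:, 0]) / 6 + (11.5 - X[:, 3]) / 1.5 - X[:, 2]
y = (risk + rng.normal(0, 1, n) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))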
Machine Learning-Based Heart Failure Worsening Prediction Model to Build Self-Monitoring Prototype as an Effort to Prevent Readmissions and Maintain Quality of Life
Rahardja, Untung; Hartomo, Kristoko Dwi; Sutedja, Indrajani; Kho, Ardi; Kamil, Muhammad Farhan
International Journal of Engineering, Science and Information Technology Vol 5, No 1 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i1.1467

Abstract

Heart failure is a chronic condition of great concern that requires recurring healthcare services. It significantly hampers patients' quality of life and increases costs for healthcare systems. If the worsening of heart failure could be detected early, interventions could be employed to prevent readmission and enhance the patient's quality of life. Accordingly, this paper explains how a model to predict worsening heart failure in high-risk patients was developed. The model uses information gathered from Electronic Health Records (EHRs), including clinical variables, vital signs, test results, and demographics, to make accurate predictions. To build the final model effectively and efficiently, different algorithms such as random forests, support vector machines, and gradient boosting were compared. The model is then embedded into a user-friendly self-monitoring prototype, allowing chronic heart failure patients to assess health indices on the fly through a mobile app and wearable devices. This secondary prevention strategy makes patients more responsible for their health and decreases hospital readmissions while improving functioning and well-being. The paper also projects future developments in chronic heart failure care, especially in first-line treatment, focusing primarily on the timing and sequencing of interventions.
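The algorithm comparison the abstract mentions could be prototyped as below; since EHR feature extraction is out of scope here, scikit-learn's synthetic classification data stands in for the real records, and AUC is one reasonable metric for an imbalanced deterioration/readmission setting.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Imbalanced stand-in data: most patients do not deteriorate.
X, y = make_classification(n_samples=800, n_features=20, weights=[0.8], random_state=0)
models = {
    "random forest": RandomForestClassifier(random_state=0),
    "SVM": SVC(),
    "gradient boosting": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name:18s} AUC = {auc.mean():.3f} +/- {auc.std():.3f}")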
AI-Assisted 3D-Printed Biomaterial Supercapacitors for Green Energy Storage
Kadao, Anjali Krushna; Prashant, Patil Manisha; Sardana, Sunaina
International Journal of Engineering, Science and Information Technology Vol 5, No 1 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i1.1361

Abstract

Advancements in biomaterial-based supercapacitors have been fuelled by the growing demand for sustainable, high-performance energy storage. This work proposes an AI-assisted, 3D-printed biomaterial supercapacitor comprising electrode materials optimised by artificial intelligence (AI), bio-based electrolytes, and intelligent performance monitoring to increase efficiency and sustainability. The AI-driven approach selects and optimises biomaterials for high conductivity, low internal resistance, and excellent charge retention. Advanced 3D printing techniques deliberately engineer porous electrodes at the microscale, facilitating fast ion diffusion and high energy storage capacity. Experimental results show a 45% increase in capacitance, a 68% reduction in charge-transfer resistance, and an 18% improvement in cycle stability relative to conventional supercapacitors. Moreover, AI-powered predictive maintenance extends device life by 60% and reduces unplanned failures by 60%. The use of biodegradable and non-toxic materials encourages environmental sustainability, making this supercapacitor a green alternative for next-generation energy storage applications. The solution suits wearable electronics, renewable energy systems, and smart devices, offering high efficiency, low environmental impact, and intelligent monitoring capability. According to this study, the combination of AI, biomaterials, and 3D printing has the potential to transform energy storage into scalable, eco-friendly, and intelligent supercapacitors for future energy demands.
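One way to picture the AI-assisted material-selection step is a weighted multi-property ranking; the sketch below is purely hypothetical (the candidate materials, property values, and weights are invented), whereas the paper's approach is a learned optimisation rather than a fixed score.

candidates = {  # name: (conductivity S/cm, internal resistance ohm, retention %), invented
    "cellulose/graphene": (120.0, 0.8, 92.0),
    "lignin/carbon":      (95.0, 1.1, 88.0),
    "chitosan/MXene":     (140.0, 0.9, 90.0),
}

def score(props, weights=(0.4, 0.3, 0.3)):
    conductivity, resistance, retention = props
    # Normalise each property; reward conductivity and retention, penalise resistance.
    return (weights[0] * conductivity / 150
            - weights[1] * resistance / 1.5
            + weights[2] * retention / 100)

for name, props in sorted(candidates.items(), key=lambda kv: -score(kv[1])):
    print(f"{name:20s} score = {score(props):.3f}")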
Algorithms and Modeling for Optimizing Sustainable Energy Systems
Jaleel Maktoof, Mohammed Abdul; Shaker, Alhamza Abdulsatar; Nayef, Hamdi Abdullah; Taher, Nada Adnan; Yousif Al Hilfi, Thamer Kadum; Maidin, Siti Sarah
International Journal of Engineering, Science and Information Technology Vol 5, No 1 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i1.1457

Abstract

The global transition toward sustainable energy necessitates intelligent, integrated solutions to overcome the intermittency of renewable sources. This paper presents and validates a comprehensive framework for optimising Hybrid Solar-Wind Energy (HSWE) systems by integrating advanced simulation, machine learning-based forecasting, and metaheuristic optimisation. Using meteorological and operational data from three distinct climate zones, we modelled and analysed a PV-wind-lithium-ion hybrid system. A neural network was employed for precise load forecasting, while Particle Swarm Optimisation (PSO) managed real-time resource allocation and storage dispatch. Comparative analysis reveals that the optimised hybrid system significantly outperforms standalone units, increasing energy production by up to 32%, improving overall energy efficiency to 92.3%, and reducing operational costs by over 36%. The simulation models demonstrated high fidelity, with predictions matching experimental field data with less than 1% error. Furthermore, the integration of predictive fault handling and intelligent load balancing enhanced system reliability, increasing the mean time between failures (MTBF) by over 70% and achieving 97.6% system availability. This research provides a validated, replicable framework for engineers and policymakers, demonstrating a practical pathway to developing efficient, economically viable, and resilient decentralised renewable energy infrastructure to meet global sustainability goals.
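A minimal particle swarm optimisation sketch in the spirit of the paper's dispatch step: choose an hourly battery dispatch that tracks a load profile given fixed renewable output. The load and renewable curves, power limits, and PSO coefficients are assumed for illustration, not taken from the paper's field data.

import numpy as np

rng = np.random.default_rng(0)
hours = 24
load = 40 + 15 * np.sin(np.linspace(0, 2 * np.pi, hours))               # kW, assumed profile
renewables = 35 + 20 * np.maximum(0, np.sin(np.linspace(-1, 4, hours))) # kW, PV + wind stand-in

def cost(dispatch):
    shortfall = load - renewables - dispatch
    return float(np.sum(np.maximum(shortfall, 0) ** 2)   # penalise unmet load
                 + 0.01 * np.sum(dispatch ** 2))          # battery-wear proxy

n_particles = 30
pos = rng.uniform(-20, 20, (n_particles, hours))          # dispatch per hour (kW)
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -20, 20)                     # respect power limits
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print(f"best dispatch cost after PSO: {pbest_cost.min():.1f}")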
A Deep Learning-Based Pipeline for Feature Extraction and Segmentation of Endometriosis Stages: A Comparative Study of Transfer Learning and CDGAN Models
Koshy, Soumya; Singh, K. Ranjith
International Journal of Engineering, Science and Information Technology Vol 5, No 1 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i1.1351

Abstract

This study proposes a deep learning-driven approach for extracting features and segmenting various stages of endometriosis from ultrasound images. The proposed pipeline integrates transfer learning with pretrained convolutional neural networks (CNNs) and conditional generative adversarial networks (CDGANs) to improve the accuracy and interpretability of the segmentation process. ResNet and DenseNet models are used in transfer learning to fine-tune pre-trained networks that classify the stages of endometriosis, and model performance is further improved by applying CDGAN-based data augmentation to the dataset. In the comparison, the CDGAN-based method is more accurate and easier to interpret than the transfer learning model, making it the preferred method for automatic staging of endometriosis. The results show improved accuracy (90%) and a higher F1-score (0.88), with CDGAN delivering the best segmentation results even in the most complex cases. Automating this portion of medical imaging for endometriosis has the potential to support more informed treatment choices.
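The transfer-learning stage the abstract names (fine-tuning a pretrained ResNet) might look like the sketch below; the four-stage label set, image size, and dummy batch are assumptions, and the real pipeline adds the CDGAN augmentation step.

import torch
import torch.nn as nn
from torchvision import models

NUM_STAGES = 4                                    # endometriosis stages I-IV (assumed)
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
for param in model.parameters():                  # freeze the pretrained backbone
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_STAGES)  # new trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of ultrasound-sized images.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_STAGES, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"dummy-batch loss: {loss.item():.3f}")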