Contact Name
Adam Mudinillah
Contact Email
adammudinillah@staialhikmahpariangan.ac.id
Phone
+6285379388533
Journal Mail Official
adammudinillah@staialhikmahpariangan.ac.id
Editorial Address
Jorong Kubang Kaciak Dusun Kubang Kaciak, Kelurahan Balai Tangah, Kecamatan Lintau Buo Utara, Kabupaten Tanah Datar, Provinsi Sumatera Barat, Kodepos 27293.
Location
Kab. Tanah Datar,
Sumatera Barat
INDONESIA
Journal of Computer Science Advancements
ISSN: 3026-3379     EISSN: 3024-899X     DOI: https://doi.org/10.70177/jsca
Core Subject: Science
Journal of Computer Science Advancements is an international peer-reviewed journal dedicated to the exchange of high-quality research results in all aspects of science, engineering, and information technology. The journal publishes state-of-the-art papers in fundamental theory, experiments, and simulation, as well as applications, with a systematically proposed method, a sufficient review of previous work, an expanded discussion, and a concise conclusion. In keeping with our commitment to the advancement of science and technology, the Journal of Computer Science Advancements follows an open-access policy that makes published articles freely available online without any subscription.
Articles: 6 Documents
Search results for issue "Vol. 3 No. 3 (2025)": 6 Documents
DESIGN OF A WEB-BASED GOODS DELIVERY INFORMATION SYSTEM WITH API SERVICES AND IOT INTEGRATION AT PT. ESA MANDIRI RUBBER Putri, Nadia Natasya; Terisia, Vany; Syamsu, Muhajir
Journal of Computer Science Advancements Vol. 3 No. 3 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i3.2415

Abstract

The advancement of information technology has driven various industrial sectors, including manufacturing, to transform toward more efficient and responsive distribution systems. PT. Esa Mandiri Rubber, a rubber manufacturing company, still relies on manual processes in managing goods delivery, resulting in various issues such as delays, distribution errors, and a lack of transparency in tracking. This study aims to design a web-based goods delivery information system integrated with Internet of Things (IoT) technology and API services. The system development method used is the Waterfall method, which consists of five stages: requirement analysis, system design, implementation, testing, and maintenance. The developed system includes delivery recording, real-time tracking, IoT device data integration, and access to information through a web interface. The results of the study show that the designed system successfully replaces the previously used manual processes, enhances distribution effectiveness, and facilitates easier monitoring and reporting. Thus, this system is capable of improving operational efficiency and the quality of logistics services at PT. Esa Mandiri Rubber.
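The abstract above describes delivery recording and real-time IoT tracking exposed through a web interface. As an illustrative sketch only (the class and field names below are hypothetical, not taken from the paper), the core entity such a system manages might look like this:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Delivery:
    """Hypothetical delivery record: status advances as IoT readings arrive."""
    order_id: str
    destination: str
    status: str = "pending"  # pending -> in_transit -> delivered
    gps_trace: list = field(default_factory=list)

    def ingest_iot_reading(self, lat: float, lon: float) -> None:
        # Append a GPS fix pushed by an on-vehicle IoT tracker.
        self.gps_trace.append((lat, lon, datetime.now(timezone.utc)))
        if self.status == "pending":
            self.status = "in_transit"

    def mark_delivered(self) -> None:
        self.status = "delivered"

d = Delivery("DO-001", "Jakarta")
d.ingest_iot_reading(-6.2, 106.8)
```

In a real deployment the `ingest_iot_reading` call would sit behind the API service the paper mentions, with trackers posting fixes over HTTP or MQTT.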
INTEGRATING COMPUTER VISION AND MECHATRONICS FOR AUTOMATED QUALITY CONTROL IN SMART PRODUCT MANUFACTURING Faizin, Kholis Nur; Al-Fahim, Ahmed; Lahti, Maria
Journal of Computer Science Advancements Vol. 3 No. 3 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i3.2638

Abstract

The complexity of smart manufacturing (Industry 4.0) demands automated quality control (AQC), as manual inspection is a major bottleneck. A critical gap exists in integrating “passive” Computer Vision (CV) detection with “active” mechatronic intervention, creating a “siloed” research problem. This research aims to design, develop, and validate a closed-loop AQC framework, integrating deep learning CV and mechatronics to autonomously perform the full QC cycle from detection to real-time physical intervention. An experimental systems integration design was employed. A Convolutional Neural Network (CNN) was trained on a 17,000-image dataset. A Robot Operating System (ROS) framework was utilized as the integration layer for “hand-eye” calibration, synchronizing the CV node with a 6-axis robotic arm on a test rig. The CV model achieved 99.7% mAP (42ms latency) and calibration yielded ±0.35mm precision. The fully integrated system validation achieved a 99.15% Defect Detection Rate (DDR), a 0.11% False Positive Rate (FPR), and a 97.4% Successful Rejection Rate (SRR). The research empirically validates a holistic, closed-loop AQC framework, successfully solving the “siloed” gap. The system provides a proven, scalable blueprint for moving beyond passive detection to fully autonomous quality control in smart manufacturing.
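The closed-loop cycle the abstract describes (CV detection feeding a mechatronic rejection action) can be sketched in skeleton form. This is a minimal illustration with stand-in functions, not the authors' ROS implementation; `cv_inspect` and `reject` stand in for the CNN node and the robotic-arm command, respectively:

```python
def cv_inspect(item: dict) -> bool:
    """Stand-in for the CNN defect classifier; True means defective."""
    return item.get("defect", False)

def reject(item_id: str, rejection_log: list) -> None:
    """Stand-in for the rejection command sent to the 6-axis arm."""
    rejection_log.append(item_id)

def closed_loop_qc(items: list, rejection_log: list) -> None:
    # The "closed loop": every positive detection immediately
    # triggers a physical intervention, with no human in between.
    for item in items:
        if cv_inspect(item):
            reject(item["id"], rejection_log)

log = []
closed_loop_qc([{"id": "A", "defect": False}, {"id": "B", "defect": True}], log)
```

In the real system, the detection and actuation nodes would communicate over ROS topics, with hand-eye calibration mapping pixel coordinates to arm coordinates.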
NATURAL LANGUAGE PROCESSING FOR AUTOMATED REQUIREMENT ENGINEERING IN AGILE SOFTWARE DEVELOPMENT Sungkar, Muchamad Sobri; Baibek, Serikbek; Hamdan, Salma
Journal of Computer Science Advancements Vol. 3 No. 3 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i3.2646

Abstract

Manual Requirement Engineering (RE) in Agile software development creates a significant bottleneck. The reliance on natural language user stories at scale results in high-volume backlogs prone to ambiguity, duplication, and incompleteness, leading to costly, downstream development defects. This research aims to design, develop, and empirically validate a novel, hybrid Natural Language Processing (NLP) framework, termed the Agile Requirement Quality (ARQ) framework, to automate the detection of these common requirement defects. The goal is to reduce cognitive load and improve defect detection velocity during backlog refinement. A mixed-methods Design Science Research (DSR) methodology was employed. We developed the ARQ artifact (a hybrid BERT and heuristic model) and validated it both in-vitro against a 5,000-story “gold standard” annotated corpus (Fleiss’ Kappa 0.86) and in-situ through a quasi-experiment with professional Agile teams. The findings demonstrate high efficacy. In-vitro validation achieved high accuracy (overall 95.2%, with F1-scores of 0.87 for ambiguity and 0.94 for duplication). The in-situ experiment was conclusive: the ARQ-assisted team achieved a 73% increase in defect detection and an 87.5% reduction in “defect leakage” compared to the control team, registering high usability (88.5 SUS). This study provides robust empirical evidence that NLP-driven automation is a viable, high-impact strategy for mitigating risk in Agile RE. The framework functions as a practical “augmented intelligence” tool, significantly reducing defect leakage and improving quality assurance velocity.
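The ARQ framework described above is a hybrid of a BERT model and heuristic rules. As a hedged sketch of the heuristic half only (the term list and the use of `difflib` as a similarity stand-in are illustrative assumptions, not the paper's method), ambiguity and duplication flags over user stories might look like this:

```python
import difflib

# Hypothetical vague-term lexicon; the paper's heuristics are not published here.
AMBIGUOUS_TERMS = {"fast", "easy", "user-friendly", "soon", "appropriate", "etc"}

def flag_ambiguity(story: str) -> bool:
    """Flag a user story containing vague, untestable wording."""
    words = {w.strip(".,!?").lower() for w in story.split()}
    return bool(words & AMBIGUOUS_TERMS)

def flag_duplicates(stories: list, threshold: float = 0.85) -> list:
    """Flag index pairs of near-duplicate stories by string similarity.
    (A real system would use semantic embeddings, e.g. BERT, instead.)"""
    pairs = []
    for i in range(len(stories)):
        for j in range(i + 1, len(stories)):
            ratio = difflib.SequenceMatcher(
                None, stories[i].lower(), stories[j].lower()).ratio()
            if ratio >= threshold:
                pairs.append((i, j))
    return pairs
```

Running such checks during backlog refinement is what the abstract means by reducing cognitive load: defects surface before a human reads every story pair.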
COMPUTING AT THE EDGE: THE ROLE OF NEUROMORPHIC CHIPS IN INTELLIGENT ROBOTICS Keolavong, Manivone; Vong, Soneva; Phoutthavong, Thipphavone
Journal of Computer Science Advancements Vol. 3 No. 3 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i3.3331

Abstract

The deployment of autonomous mobile robots in resource-constrained environments is currently impeded by the excessive power consumption and latency bottlenecks of traditional Von Neumann architectures. This study investigates the efficacy of neuromorphic computing as a hardware solution for low-power, low-latency edge intelligence, specifically focusing on obstacle avoidance and navigational endurance. A quantitative comparative analysis was conducted benchmarking a Spiking Neural Network (SNN) based control architecture against standard embedded GPU solutions, utilizing event-based vision sensors to evaluate energy efficiency, inference latency, and task success rates. Empirical results demonstrate that the neuromorphic architecture achieved a twenty-fold reduction in power consumption (0.25 W) and sub-millisecond latency, significantly outperforming synchronous baselines while maintaining a 98.2% navigational success rate. The findings validate event-driven processing as a superior paradigm for edge robotics, offering a sustainable path toward "Green Robotics" with extended operational autonomy independent of cloud connectivity.
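The power advantage the abstract reports comes from event-driven spiking computation: neurons only do work when input arrives, instead of processing every frame. A minimal leaky integrate-and-fire update (a textbook SNN building block, not the paper's architecture; the constants are illustrative) looks like this:

```python
def lif_step(v: float, input_current: float,
             tau: float = 0.9, threshold: float = 1.0):
    """One leaky integrate-and-fire update.
    Membrane potential leaks by factor tau, integrates input,
    and emits a spike (resetting to 0) when it crosses threshold.
    Returns (new_potential, spiked)."""
    v = tau * v + input_current
    if v >= threshold:
        return 0.0, True
    return v, False

# Feed a short event stream; with no input (0.0) the neuron is idle,
# which is exactly where the energy savings come from.
v, spikes = 0.0, 0
for current in [0.3, 0.4, 0.5, 0.0, 0.9]:
    v, fired = lif_step(v, current)
    spikes += fired
```

Paired with an event-based vision sensor, most timesteps carry zero input, so a neuromorphic chip sits near-idle while a frame-based GPU pipeline burns power on every frame regardless of scene activity.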
SAVING THE WORLD IN DNA: RECENT PROGRESS IN DNA STORAGE TECHNOLOGY IN 2026 Abakar, Sonia; Gaba, Brahim; Saleh, Mahamat
Journal of Computer Science Advancements Vol. 3 No. 3 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i3.3336

Abstract

The exponential expansion of the global datasphere is rapidly outpacing the physical and environmental capacity of silicon-based storage media. This study investigates the efficacy of a novel “dynamic-corrective” enzymatic synthesis architecture to address the critical cost and latency bottlenecks hindering the commercial adoption of DNA data storage. Utilizing a quantitative “bits-to-molecules-to-bits” experimental framework, we benchmarked an engineered Terminal Deoxynucleotidyl Transferase (TdT) protocol against traditional phosphoramidite chemistry, encoding a 10-terabyte heterogeneous dataset protected by hybrid LDPC-fountain codes. Empirical results demonstrate that the enzymatic system achieved a sustained write latency of 250 milliseconds per nucleotide and a synthesis cost of $0.05 per megabyte, representing a 70,000-fold reduction over chemical baselines. The system maintained a high logical density of 3.6 bits per nucleotide with 100% data recovery, while silica encapsulation provided stability equivalent to 500 years of aging. We definitively conclude that 2026-era enzymatic synthesis has matured into a scalable industrial solution, validating DNA as a robust, zero-energy archival medium essential for decarbonizing the future of global information infrastructure.
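The “bits-to-molecules-to-bits” framework rests on mapping binary data onto the four nucleotides. The simplest textbook mapping is 2 bits per base (the paper's 3.6 bits/nt density comes from more sophisticated coding plus LDPC-fountain protection, which this sketch does not attempt):

```python
# Naive 2-bits-per-nucleotide codec; illustrative only, no error correction.
BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
INV = {v: k for k, v in BASE.items()}

def encode(data: bytes) -> str:
    """Bytes -> nucleotide string (the 'bits-to-molecules' direction)."""
    bits = "".join(f"{b:08b}" for b in data)
    return "".join(BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(seq: str) -> bytes:
    """Nucleotide string -> bytes (the 'molecules-to-bits' direction)."""
    bits = "".join(INV[b] for b in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

Real pipelines add constraints this codec ignores, such as avoiding long homopolymer runs and balancing GC content, which is partly why practical densities differ from the 2 bits/nt ceiling of a direct mapping.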
THE AI ENERGY DILEMMA: FINDING THE MIDDLE GROUND BETWEEN HIGH PERFORMANCE AND ECO-FRIENDLINESS Scott, James; Davis, Olivia; Green, Jessica
Journal of Computer Science Advancements Vol. 3 No. 3 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i3.3337

Abstract

The exponential escalation of computational requirements for training and deploying Deep Learning models has precipitated an energy crisis, necessitating a critical reevaluation of the trade-off between algorithmic performance and environmental sustainability. This study aims to reconcile these conflicting demands by developing and validating a novel Dynamic Energy-Aware Pruning (DEAP) framework designed to maximize inference efficiency without compromising predictive accuracy. Employing a rigorous quantitative experimental design, we benchmarked state-of-the-art neural architectures, including ResNet-50 and Large Language Models (LLMs), across diverse hardware environments. The research utilized real-time telemetry to measure total energy consumption (Joules), thermal output, and carbon intensity against standard accuracy metrics. Empirical results demonstrate that the proposed framework achieved a 42% reduction in energy consumption and stabilized hardware thermals, while maintaining predictive performance within a strict 1.5% non-inferiority margin compared to dense baselines. We definitively conclude that algorithmic sparsity effectively decouples high-level intelligence from excessive power usage, establishing a viable engineering paradigm for “Green AI” that aligns the trajectory of artificial intelligence with global decarbonization targets.
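The sparsity the abstract relies on is typically induced by pruning low-magnitude weights. As a minimal one-shot sketch (the DEAP framework itself is dynamic and energy-aware; this shows only the basic magnitude criterion it builds on, over a plain list rather than real tensors):

```python
def magnitude_prune(weights: list, sparsity: float = 0.5) -> list:
    """Zero out roughly the smallest-magnitude `sparsity` fraction of weights.
    One-shot global magnitude pruning; ties at the threshold may be
    pruned together, so the realized sparsity can slightly exceed the target."""
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

pruned = magnitude_prune([0.1, -0.2, 0.3, -0.4], sparsity=0.5)
```

Zeroed weights let sparse kernels skip multiply-accumulates entirely, which is the mechanism that converts algorithmic sparsity into the reported Joule-level savings.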

Page 1 of 1 | Total Records: 6