Contact Name
Adam Mudinillah
Contact Email
adammudinillah@staialhikmahpariangan.ac.id
Phone
+6285379388533
Journal Mail Official
adammudinillah@staialhikmahpariangan.ac.id
Editorial Address
Jorong Kubang Kaciak Dusun Kubang Kaciak, Kelurahan Balai Tangah, Kecamatan Lintau Buo Utara, Kabupaten Tanah Datar, Provinsi Sumatera Barat, Kodepos 27293.
Location
Kab. Tanah Datar,
Sumatera Barat
INDONESIA
Journal of Computer Science Advancements
ISSN : 3026-3379     EISSN : 3024-899X     DOI : https://doi.org/10.70177/jsca
Core Subject : Science
Journal of Computer Science Advancements is an international peer-reviewed journal dedicated to the exchange of high-quality research results in all aspects of science, engineering, and information technology. The journal publishes state-of-the-art papers on fundamental theory, experiments, and simulation, as well as applications, with a systematically proposed method, a sufficient review of previous work, expanded discussion, and a concise conclusion. As part of its commitment to the advancement of science and technology, the Journal of Computer Science Advancements follows an open-access policy that makes published articles freely available online without any subscription.
Articles 106 Documents
QUANTUM ADVANTAGE HAS ARRIVED: TANGIBLE IMPACTS ON DRUG DISCOVERY AND NEW MATERIALS Wei, Li; Hui, Zhou; Yang, Liu
Journal of Computer Science Advancements Vol. 3 No. 5 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i5.3325

Abstract

The advancement of computational chemistry is currently stalled by the exponential memory scaling required to simulate strongly correlated electron systems on classical supercomputers. This fundamental barrier significantly impedes the rational design of complex pharmaceuticals and next-generation catalytic materials. This research aims to rigorously validate the immediate utility of Noisy Intermediate-Scale Quantum (NISQ) processors, demonstrating that “Quantum Advantage” has shifted from a theoretical milestone to a practical industrial reality. We employed a comparative research design utilizing the Variational Quantum Eigensolver (VQE) algorithm on the IBM Eagle quantum processor. The study targeted the electronic structure of iron-sulfur clusters and KRAS-G12C inhibitor binding sites, benchmarking quantum outputs against classical Density Functional Theory (DFT) and Full Configuration Interaction (FCI) standards, utilizing Zero-Noise Extrapolation for error mitigation. Results indicate that quantum simulations achieved chemical accuracy (within 1.6 kcal/mol) for these complex systems, whereas classical methods failed with deviations exceeding 8 kcal/mol. The data confirms that quantum hardware can now resolve electronic correlations invisible to classical approximation. We conclude that quantum computing offers a tangible, immediate pathway to accelerate discovery cycles in drug development and material science, necessitating the integration of hybrid quantum workflows into modern R&D pipelines.
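The "chemical accuracy" benchmark used above is a fixed window of 1.6 kcal/mol around a reference energy (here, the FCI standard). A minimal sketch of that acceptance criterion, with illustrative deviations rather than the paper's actual data:

```python
# Hedged sketch: the chemical-accuracy criterion is a 1.6 kcal/mol window
# around a Full Configuration Interaction (FCI) reference energy.
# The deviation values below are illustrative, not the study's measurements.

CHEMICAL_ACCURACY = 1.6  # kcal/mol

def within_chemical_accuracy(computed: float, reference: float) -> bool:
    """Return True if a computed energy lies within 1.6 kcal/mol of reference."""
    return abs(computed - reference) <= CHEMICAL_ACCURACY

# Deviations (kcal/mol) mirroring the reported pattern: quantum within the
# window, classical approximation outside it (>8 kcal/mol).
vqe_deviation, dft_deviation = 1.2, 8.4
print(within_chemical_accuracy(vqe_deviation, 0.0))  # passes the criterion
print(within_chemical_accuracy(dft_deviation, 0.0))  # fails the criterion
```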
THE QUBIT PARADOX: WHY MORE QUBITS ACTUALLY LOWER ERROR RATES? Fujita, Miku; Suzuki, Ren; Nishida, Daiki
Journal of Computer Science Advancements Vol. 3 No. 5 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i5.3329

Abstract

Adding physical qubits intuitively introduces greater cumulative noise and control complexity. This “Qubit Paradox” presents a fundamental barrier to scalability, suggesting that larger systems might become inherently less stable. This research aims to rigorously validate the threshold theorem, defining the precise boundary where topological protection overcomes physical noise accumulation. We utilized high-fidelity Monte Carlo simulations of Rotated Surface Codes, scaling from distance d=3 to d=9, under realistic circuit-level noise models including leakage and crosstalk. Decoding was executed using the Minimum Weight Perfect Matching (MWPM) algorithm to analyze logical failure rates across 10⁹ error correction cycles. Results identify a critical physical error threshold of approximately 0.57%. Below this value, logical error rates exhibited exponential suppression via power-law decay, reducing by seven orders of magnitude at distance-9. Conversely, systems operating above this threshold demonstrated error amplification with increased scale. We conclude that the paradox resolves only when individual gate fidelity surpasses the threshold, mandating that hardware optimization must precede quantitative scaling. These findings establish a validated roadmap for the transition from the NISQ era to fault-tolerant architecture.
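The threshold behavior described in the abstract follows the standard below-threshold scaling model for surface codes, p_L ≈ A·(p/p_th)^((d+1)/2). A sketch of that model, taking the 0.57% threshold from the abstract; the prefactor A and the physical error rates are illustrative assumptions:

```python
# Hedged sketch: the textbook scaling model for surface-code logical error,
# p_L ≈ A * (p / p_th)^((d + 1) / 2). P_TH = 0.57% comes from the abstract;
# A and the physical rates p are illustrative assumptions, not fitted values.

P_TH = 0.0057  # critical physical error threshold (reported in the abstract)

def logical_error_rate(p: float, d: int, A: float = 0.1) -> float:
    """Model the logical failure rate of a distance-d rotated surface code."""
    return A * (p / P_TH) ** ((d + 1) / 2)

# Below threshold (p < P_TH): error is suppressed as distance grows.
for d in (3, 5, 7, 9):
    print(d, logical_error_rate(0.001, d))

# Above threshold (p > P_TH): the same formula amplifies error with scale.
print(logical_error_rate(0.01, 9) > logical_error_rate(0.01, 3))
```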
MIMICKING THE HUMAN BRAIN: NEUROMORPHIC ARCHITECTURE SOLUTIONS FOR AI ENERGY EFFICIENCY Anh, Nguyen Tuan; Anh, Le Thi Lan; Thao, Pham Thanh
Journal of Computer Science Advancements Vol. 3 No. 5 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i5.3330

Abstract

The exponential proliferation of Artificial Intelligence (AI) is currently constrained by the “memory wall” and excessive power consumption inherent in traditional Von Neumann architectures. This study addresses these physical limitations by proposing a bio-inspired neuromorphic architecture that integrates memristive crossbar arrays with event-driven Spiking Neural Networks (SNNs) to mimic biological synaptic efficiency. The research employs a quantitative cross-layer simulation framework to benchmark the proposed design against industry-standard GPUs and TPUs, utilizing standard datasets to evaluate inference latency, power dissipation, and classification accuracy. Results indicate that the neuromorphic architecture achieves a reduction in energy consumption by orders of magnitude (0.12 pJ/operation) compared to baseline accelerators, with power usage scaling linearly with input sparsity. Although a minor trade-off in precision was observed due to device stochasticity, the system maintained a competitive classification accuracy of 92.4%. The study concludes that mimicking the asynchronous nature of the human brain offers a sustainable paradigm for “Green AI,” validating neuromorphic computing as a critical solution for overcoming the energy crisis in next-generation edge intelligence and autonomous systems.
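The sparsity-linear scaling claimed above follows from event-driven accounting: energy is spent only when a spike fires, so total energy is roughly (active operations) × (energy per operation). A sketch of that arithmetic, using the 0.12 pJ/operation figure from the abstract; the workload sizes are invented for illustration:

```python
# Hedged sketch: event-driven SNN energy accounting. Only the fraction of
# operations that actually fire consumes energy, so usage scales linearly
# with input sparsity. E_PER_OP_PJ is the abstract's reported figure; the
# operation counts below are illustrative assumptions.

E_PER_OP_PJ = 0.12  # pJ per synaptic operation (reported in the abstract)

def snn_energy_pj(n_ops: int, sparsity: float) -> float:
    """Energy (pJ) when only a (1 - sparsity) fraction of operations fire."""
    return n_ops * (1.0 - sparsity) * E_PER_OP_PJ

print(snn_energy_pj(1_000_000, 0.90))  # highly sparse input: 12,000 pJ
print(snn_energy_pj(1_000_000, 0.50))  # denser input: 60,000 pJ
```

Doubling activity (halving sparsity headroom) doubles energy, which is the linear relationship the abstract reports.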
COMPUTING AT THE EDGE: THE ROLE OF NEUROMORPHIC CHIPS IN INTELLIGENT ROBOTICS Keolavong, Manivone; Vong, Soneva; Phoutthavong, Thipphavone
Journal of Computer Science Advancements Vol. 3 No. 3 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i3.3331

Abstract

The deployment of autonomous mobile robots in resource-constrained environments is currently impeded by the excessive power consumption and latency bottlenecks of traditional Von Neumann architectures. This study investigates the efficacy of neuromorphic computing as a hardware solution for low-power, low-latency edge intelligence, specifically focusing on obstacle avoidance and navigational endurance. A quantitative comparative analysis was conducted benchmarking a Spiking Neural Network (SNN) based control architecture against standard embedded GPU solutions, utilizing event-based vision sensors to evaluate energy efficiency, inference latency, and task success rates. Empirical results demonstrate that the neuromorphic architecture achieved a twenty-fold reduction in power consumption (0.25 W) and sub-millisecond latency, significantly outperforming synchronous baselines while maintaining a 98.2% navigational success rate. The findings validate event-driven processing as a superior paradigm for edge robotics, offering a sustainable path toward "Green Robotics" with extended operational autonomy independent of cloud connectivity.
GOODBYE LATENCY: WHY FUTURE MEDICAL DEVICES NEED ARTIFICIAL BRAINS Koh, Megan; Tan, Marcus; Wong, Lucas
Journal of Computer Science Advancements Vol. 3 No. 4 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i4.3332

Abstract

The transition of medical technology from passive monitoring to autonomous, closed-loop intervention is critically impeded by the latency and power inefficiencies of traditional Von Neumann computing architectures. This study investigates the efficacy of neuromorphic hardware as a solution, aiming to validate a bio-inspired architecture capable of sub-millisecond decision-making for life-critical applications. Employing a rigorous hardware-in-the-loop simulation framework, we benchmarked a custom Spiking Neural Network (SNN) against industry-standard microcontrollers, utilizing large-scale cardiac and neurological datasets to evaluate inference speed, energy consumption, and signal fidelity. Quantitative results reveal that the neuromorphic system achieved a 94% reduction in end-to-end latency and a thirty-eight-fold improvement in energy efficiency compared to the digital baseline. The event-driven architecture successfully maintained 96.4% diagnostic accuracy while operating within a negligible thermal envelope suitable for implantation. These findings definitively establish that mimicking biological asynchronous processing eliminates fatal temporal delays, validating neuromorphic “artificial brains” as the essential technological foundation for the next generation of responsive, privacy-secure, and energy-autonomous medical implants.
FUTURE DATA CENTERS: LIQUID IMMERSION COOLING INNOVATION TO WITHSTAND AI HEAT Thai, Aom; Krit, Pong; Lek, Siri
Journal of Computer Science Advancements Vol. 3 No. 4 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i4.3333

Abstract

The exponential escalation of computational density required by modern Artificial Intelligence (AI) and Large Language Models has pushed traditional air-cooled data center infrastructures to their thermodynamic limits. This study investigates the efficacy of single-phase liquid immersion cooling as a transformative solution to manage the extreme thermal flux of next-generation AI accelerators. Adopting a quantitative experimental design, we benchmarked a high-density GPU cluster submerged in a proprietary dielectric fluid against a standard forced-air baseline under intensive MLPerf training workloads. The research focused on evaluating key performance indicators, including Power Usage Effectiveness (PUE), processor junction temperatures, and total energy consumption over a 168-hour stress test. Results demonstrate that the immersion architecture achieved a near-ideal PUE of 1.04, representing a 34% efficiency improvement over the air-cooled control group. Furthermore, the liquid medium maintained GPU core temperatures 20°C lower than the baseline, effectively eliminating thermal throttling events and enhancing computational stability. The study concludes that shifting from aerodynamic to hydrodynamic cooling is not merely an efficiency upgrade but a physical prerequisite for the sustainable scaling of exascale AI infrastructure, offering a viable pathway to decarbonize the expanding digital economy.
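Power Usage Effectiveness, the headline metric above, is simply total facility energy divided by IT equipment energy (1.0 is the ideal). A sketch of the arithmetic behind the reported 34% improvement; the kWh figures are invented to reproduce the 1.04 PUE, not taken from the study:

```python
# Hedged sketch: PUE = total facility energy / IT equipment energy.
# The PUE values (1.04 immersion, ~34% better than air) come from the
# abstract; the kWh inputs below are illustrative assumptions.

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness; 1.0 means zero non-IT overhead."""
    return total_facility_kwh / it_kwh

immersion = pue(1040.0, 1000.0)   # 4% cooling/overhead energy
air_cooled = pue(1580.0, 1000.0)  # heavier HVAC overhead

print(immersion, air_cooled)
print(1.0 - immersion / air_cooled)  # fractional facility-energy improvement
```

With these illustrative inputs the improvement works out to ≈0.342, matching the 34% figure the abstract reports.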
SAVING THE WORLD IN DNA: RECENT PROGRESS IN DNA STORAGE TECHNOLOGY IN 2026 Abakar, Sonia; Gaba, Brahim; Saleh, Mahamat
Journal of Computer Science Advancements Vol. 3 No. 3 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i3.3336

Abstract

The exponential expansion of the global datasphere is rapidly outpacing the physical and environmental capacity of silicon-based storage media. This study investigates the efficacy of a novel “dynamic-corrective” enzymatic synthesis architecture to address the critical cost and latency bottlenecks hindering the commercial adoption of DNA data storage. Utilizing a quantitative “bits-to-molecules-to-bits” experimental framework, we benchmarked an engineered Terminal Deoxynucleotidyl Transferase (TdT) protocol against traditional phosphoramidite chemistry, encoding a 10-terabyte heterogeneous dataset protected by hybrid LDPC-fountain codes. Empirical results demonstrate that the enzymatic system achieved a sustained write latency of 250 milliseconds per nucleotide and a synthesis cost of $0.05 per megabyte, representing a 70,000-fold reduction over chemical baselines. The system maintained a high logical density of 3.6 bits per nucleotide with 100% data recovery, while silica encapsulation demonstrated stability equivalent to 500 years of aging. We definitively conclude that 2026-era enzymatic synthesis has matured into a scalable industrial solution, validating DNA as a robust, zero-energy archival medium essential for decarbonizing the future of global information infrastructure.
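The baseline "bits-to-molecules" step maps digital data onto the four-letter DNA alphabet at 2 bits per nucleotide; the 3.6 bits/nt logical density reported above additionally relies on coding schemes not reproduced here. A minimal sketch of the 2-bit mapping and its inverse, with an assumed A=00, C=01, G=10, T=11 assignment:

```python
# Hedged sketch: the textbook 2-bits-per-nucleotide encoding behind
# "bits-to-molecules-to-bits" pipelines. The base assignment (A=00, C=01,
# G=10, T=11) is an assumption; real schemes add constraints and ECC.

BASES = "ACGT"

def encode(data: bytes) -> str:
    """Map each byte to four nucleotides, two bits per base (MSB first)."""
    return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def decode(strand: str) -> bytes:
    """Invert the 2-bit mapping, four bases back into one byte."""
    bits = [BASES.index(c) for c in strand]
    return bytes(
        (bits[i] << 6) | (bits[i + 1] << 4) | (bits[i + 2] << 2) | bits[i + 3]
        for i in range(0, len(bits), 4)
    )

strand = encode(b"DNA")
print(strand)                     # 12-nucleotide strand for 3 bytes
print(decode(strand) == b"DNA")   # lossless round trip
```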
THE AI ENERGY DILEMMA: FINDING THE MIDDLE GROUND BETWEEN HIGH PERFORMANCE AND ECO-FRIENDLINESS Scott, James; Davis, Olivia; Green, Jessica
Journal of Computer Science Advancements Vol. 3 No. 3 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i3.3337

Abstract

The exponential escalation of computational requirements for training and deploying Deep Learning models has precipitated an energy crisis, necessitating a critical reevaluation of the trade-off between algorithmic performance and environmental sustainability. This study aims to reconcile these conflicting demands by developing and validating a novel Dynamic Energy-Aware Pruning (DEAP) framework designed to maximize inference efficiency without compromising predictive accuracy. Employing a rigorous quantitative experimental design, we benchmarked state-of-the-art neural architectures, including ResNet-50 and Large Language Models (LLMs), across diverse hardware environments. The research utilized real-time telemetry to measure total energy consumption (Joules), thermal output, and carbon intensity against standard accuracy metrics. Empirical results demonstrate that the proposed framework achieved a 42% reduction in energy consumption and stabilized hardware thermals, while maintaining predictive performance within a strict 1.5% non-inferiority margin compared to dense baselines. We definitively conclude that algorithmic sparsity effectively decouples high-level intelligence from excessive power usage, establishing a viable engineering paradigm for “Green AI” that aligns the trajectory of artificial intelligence with global decarbonization targets.
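The DEAP framework itself is not published here, but the generic mechanism behind energy-aware sparsity is global magnitude pruning: zeroing the smallest-magnitude weights so their multiply-accumulates can be skipped at inference time. A minimal sketch of that mechanism, not the authors' method:

```python
# Hedged sketch: plain global magnitude pruning, the generic technique
# underlying energy-aware sparsity. This is NOT the paper's DEAP framework,
# only the standard mechanism it builds on.

def prune(weights: list[float], sparsity: float) -> list[float]:
    """Zero out the smallest-magnitude fraction `sparsity` of the weights."""
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.2]
pruned = prune(w, 0.5)
print(pruned)  # the three smallest-magnitude weights are zeroed
print(sum(1 for x in pruned if x == 0.0) / len(pruned))  # achieved sparsity
```

Each zeroed weight removes a multiply-accumulate from the forward pass, which is where the energy saving comes from on sparsity-aware hardware.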
ENCRYPTION APOCALYPSE? PREPARING DATA SECURITY FOR THE QUANTUM COMPUTING ERA Iqbal, Kiran; Ali, Zainab; Aslam, Bilal
Journal of Computer Science Advancements Vol. 3 No. 4 (2025)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v3i4.3338

Abstract

The imminent maturation of quantum computing threatens to nullify the mathematical hardness underpinning global Public Key Infrastructure, creating an urgent “Harvest Now, Decrypt Later” vulnerability. This study investigates the operational feasibility of transitioning to NIST-standardized Post-Quantum Cryptography (PQC) protocols within heterogeneous network environments. Utilizing a rigorous quantitative benchmarking framework, we evaluated the performance of lattice-based primitives, specifically ML-KEM and ML-DSA, against classical standards across high-performance servers and resource-constrained IoT devices. Empirical data reveals a fundamental architectural paradigm shift: while PQC algorithms exhibit superior computational execution speeds, they introduce severe transmission overheads, resulting in memory saturation and packet fragmentation on edge hardware. Results demonstrate that hybrid encryption schemes provide valid risk mitigation but incur statistically significant latency penalties due to expanded artifact sizes. We definitively conclude that the “Encryption Apocalypse” is primarily a bandwidth and memory bottleneck rather than a computational one, mandating the immediate deployment of adaptive crypto-agility frameworks to manage the infrastructural constraints of the post-quantum era.
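The "transmission overhead" the abstract identifies is concrete: PQC artifacts are an order of magnitude larger than their classical counterparts. A sketch of the size arithmetic for a hybrid key exchange; the byte counts below are the FIPS 203/204 parameter sizes as I understand them (ML-KEM-768, ML-DSA-65) against classical X25519/Ed25519, and should be verified against the standards:

```python
# Hedged sketch: artifact-size comparison driving PQC transmission overhead.
# Byte counts are my reading of FIPS 203 (ML-KEM-768) and FIPS 204
# (ML-DSA-65) versus classical X25519/Ed25519 -- verify before relying on them.

sizes_bytes = {
    "X25519 public key": 32,
    "ML-KEM-768 encapsulation key": 1184,
    "ML-KEM-768 ciphertext": 1088,
    "Ed25519 signature": 64,
    "ML-DSA-65 signature": 3309,
}

# A hybrid handshake carries both the classical and the PQC key material:
hybrid_key_share = (
    sizes_bytes["X25519 public key"]
    + sizes_bytes["ML-KEM-768 encapsulation key"]
)
print(hybrid_key_share)  # bytes on the wire, versus 32 for classical alone
print(hybrid_key_share / sizes_bytes["X25519 public key"])  # overhead factor
```

On constrained links this expansion, not the math itself, is what causes the packet fragmentation and memory saturation the study reports.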
HUMAN COMPUTER INTERACTION DESIGN ENHANCING USABILITY IN MOBILE COMPUTING APPLICATIONS Prana Utama Sembiring, Afen; Ivander, Filbert; Sufarnap, Erlanie
Journal of Computer Science Advancements Vol. 4 No. 1 (2026)
Publisher : Yayasan Adra Karima Hubbi

DOI: 10.70177/jsca.v4i1.3390

Abstract

The rapid growth of mobile computing applications has intensified the need for effective Human-Computer Interaction design to ensure high usability and positive user experience. Mobile applications are used in diverse contexts characterized by limited screen space, touch-based interaction, and frequent interruptions, making usability a critical determinant of system acceptance and sustained use. This study aims to examine how Human-Computer Interaction design principles contribute to enhancing usability in mobile computing applications. The research employed a mixed-methods approach combining usability testing, standardized usability questionnaires, and qualitative user interviews to capture both performance-based and perceptual usability outcomes. Quantitative data focused on task completion time, error rates, task success, and perceived usability, while qualitative data explored user interaction experiences and design-related challenges. The results indicate that mobile applications designed with clear navigation structures, consistent visual elements, and effective feedback mechanisms demonstrate significantly higher usability scores, faster task completion, and lower error frequency. Qualitative findings further reveal increased user confidence, reduced cognitive load, and higher satisfaction when interacting with well-designed interfaces. The study concludes that Human-Computer Interaction design plays a central role in enhancing usability in mobile computing applications. Systematic integration of user-centered design principles throughout the development process is essential for creating efficient, effective, and satisfying mobile applications across various domains.
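The abstract mentions standardized usability questionnaires without naming one; the System Usability Scale (SUS) is the most common choice, so assuming SUS here, its scoring rule is: odd-numbered items contribute (score − 1), even-numbered items contribute (5 − score), and the sum is multiplied by 2.5 to give a 0-100 scale:

```python
# Hedged sketch: SUS scoring. The abstract does not name its questionnaire;
# SUS is assumed here purely to illustrate how "usability scores" are
# typically computed from ten 1-5 Likert responses.

def sus_score(responses: list[int]) -> float:
    """Score ten 1-5 Likert responses on the 0-100 SUS scale."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1,3,5,... are positive-worded
        for i, r in enumerate(responses)
    )
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible: 100.0
print(sus_score([4, 2, 4, 2, 4, 2, 5, 1, 4, 2]))  # a good-but-not-perfect app
```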
