Contact Name
Furizal
Contact Email
sjer.editor@gmail.com
Phone
+6282386092684
Journal Mail Official
sjer.editor@gmail.com
Editorial Address
Jl. Poros Seroja, Kesra, Kepenuhan Barat Sei Rokan Jaya, Kec. Kepenuhan, Kab. Rokan Hulu, Riau
Location
Kab. Rokan Hulu,
Riau
INDONESIA
Scientific Journal of Engineering Research
ISSN : -     EISSN : 3109-1725     DOI : https://doi.org/10.64539/sjer
Core Subject : Engineering
The Scientific Journal of Engineering Research (SJER) is a peer-reviewed, open-access scientific journal managed and published by PT. Teknologi Futuristik Indonesia in collaboration with Universitas Qamarul Huda Badaruddin Bagu and Peneliti Teknologi Teknik Indonesia. The journal is committed to publishing high-quality articles in all fundamental and interdisciplinary areas of engineering, with a particular emphasis on advancements in Information Technology. It encourages submissions that explore emerging fields such as Machine Learning, Internet of Things (IoT), Deep Learning, Artificial Intelligence (AI), Blockchain, and Big Data, which are at the forefront of innovation and engineering transformation.

SJER welcomes original research articles, review papers, and studies involving simulation and practical applications that contribute to advancements in engineering, and it encourages research that integrates these technologies across engineering disciplines. The scope of the journal includes, but is not limited to:
- Mechanical Engineering
- Electrical Engineering
- Electronic Engineering
- Civil Engineering
- Architectural Engineering
- Chemical Engineering
- Mechatronics and Robotics
- Computer Engineering
- Industrial Engineering
- Environmental Engineering
- Materials Engineering
- Energy Engineering
- All fields related to engineering

By fostering innovation and bridging knowledge gaps, SJER aims to contribute to the development of sustainable and intelligent engineering systems for the modern era.
Articles in this issue, Vol. 1 No. 3 (2025): September: 5 Documents
Early Detection of Brain Tumors: Performance Evaluation of AlexNet and GoogleNet on Different Medical Image Resolutions
Authors: Muis, Alwas; Rustiawan, Angga; Oyeyemi, Babatunde Bamidele; Syukur, Abdul; Furizal
Scientific Journal of Engineering Research Vol. 1 No. 3 (2025): September
Publisher : PT. Teknologi Futuristik Indonesia

DOI: 10.64539/sjer.v1i3.2025.10

Abstract

Early detection of brain tumors through medical imaging is crucial to improving treatment success rates. This study aims to classify brain tumors using two deep learning models, AlexNet and GoogleNet, by testing three image sizes. The dataset used consists of four classes: glioma, no tumor, meningioma, and pituitary. The test results show that the AlexNet model achieves the best accuracy of 98% at a resolution of 150x150, while GoogleNet shows stable performance with the highest accuracy of 96% at both 150x150 and 200x200 resolutions. The medium resolution (150x150) proves to be optimal for both models, providing the best balance between visual information and processing efficiency. This study highlights the potential use of AlexNet and GoogleNet in brain tumor classification, with opportunities for performance improvement through further development, such as ensemble techniques and the use of a larger dataset.
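The abstract compares model accuracy at several input resolutions, with 150x150 reported as optimal. The following is a minimal NumPy sketch of the resizing step such a study implies, using a simple nearest-neighbor resize; it is illustrative only and not the paper's actual preprocessing pipeline (which would typically use a CNN framework's image loader). The third tested size is not named in the abstract, so only the two reported resolutions are shown.

```python
import numpy as np

def resize_nearest(img, out_h, out_w):
    """Nearest-neighbor resize for a 2-D grayscale array."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h   # source row index for each output row
    cols = np.arange(out_w) * w // out_w   # source column index for each output column
    return img[rows][:, cols]

# Toy stand-in for an MRI slice, resized to the two resolutions the abstract reports.
scan = np.random.default_rng(0).integers(0, 256, size=(512, 512), dtype=np.uint8)
for side in (150, 200):
    resized = resize_nearest(scan, side, side)
    print(side, resized.shape)
```

In practice the resized images would then be fed to the AlexNet or GoogleNet input layer; the trade-off the abstract describes is that smaller inputs lose visual detail while larger ones cost more computation.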
Effectiveness of Fourier, Wiener, Bilateral, and CLAHE Denoising Methods for CT Scan Image Noise Reduction
Authors: Kobra, Mst Jannatul; Nakib, Arman Mohammad; Mweetwa, Peter; Rahman, Md Owahedur
Scientific Journal of Engineering Research Vol. 1 No. 3 (2025): September
Publisher : PT. Teknologi Futuristik Indonesia

DOI: 10.64539/sjer.v1i3.2025.27

Abstract

Proper noise reduction in CT scan images remains crucial for better diagnostic results and clinical decisions. This study quantitatively evaluates the effectiveness of four popular noise-reduction methods: Fourier-based denoising, Wiener filtering, bilateral filtering, and Contrast Limited Adaptive Histogram Equalization (CLAHE), applied to more than 500 CT scan images. The methods were assessed using Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM), with Mean Squared Error (MSE) as an additional metric. Bilateral filtering emerged as the best technique, with a PSNR of 50.37 dB, an SSIM of 0.9940, and an MSE of 0.5967. Fourier-based denoising succeeded in removing high-frequency noise, but its PSNR of 25.89 dB, SSIM of 0.8138, and MSE of 167.4976 indicate loss of crucial image information. Wiener filtering offered balanced performance, with a PSNR of 40.87 dB, an SSIM of 0.9809, and an MSE of 5.3270, outperforming Fourier denoising in SSIM but with a higher MSE. CLAHE produced the poorest denoising outcomes, with the lowest PSNR of 21.51 dB, an SSIM of 0.5707, and the highest MSE of 459.1894, while introducing undesirable artifacts. This study stands out for its full evaluation of four denoising techniques on a large dataset, enabling more precise analysis than prior work. The results show bilateral filtering to be the most reliable technique for CT scan image noise reduction while preserving image quality, making it a suitable choice for clinical use. The study contributes to research on medical-image quality enhancement, directly benefiting clinical diagnostics and therapeutic planning.
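The MSE and PSNR metrics the abstract relies on are standard and easy to compute; the sketch below shows both in plain NumPy (SSIM is more involved and is omitted). This is an illustration of the evaluation metrics, not the authors' code; the noise level and image size are arbitrary.

```python
import numpy as np

def mse(ref, test):
    """Mean squared error between two images of equal shape."""
    return float(np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2))

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB; higher means less distortion."""
    err = mse(ref, test)
    return float("inf") if err == 0 else 10.0 * np.log10(max_val ** 2 / err)

rng = np.random.default_rng(1)
clean = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
noisy = clean + rng.normal(0, 10, size=clean.shape)   # additive Gaussian noise

print(f"MSE  = {mse(clean, noisy):.2f}")
print(f"PSNR = {psnr(clean, noisy):.2f} dB")
```

A denoising filter is then judged by how much it raises PSNR (and SSIM) of the filtered image relative to the clean reference, which is how the 50.37 dB figure for bilateral filtering was obtained.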
A Thirdweb-Based Smart Contract Framework for Secure Sharing of Human Genetic Data on the Ethereum Blockchain
Authors: Famuji, Tri Stiyo; Grancho, Bernadine; Fanani, Galih Pramuja Inggam; Talirongan, Hidear; Sumantri, Raden Bagus Bambang
Scientific Journal of Engineering Research Vol. 1 No. 3 (2025): September
Publisher : PT. Teknologi Futuristik Indonesia

DOI: 10.64539/sjer.v1i3.2025.30

Abstract

Human genetic data, crucial for advancing personalized medicine, requires secure and privacy-preserving management solutions. Traditional approaches face challenges in scalability, security, and decentralized access control. This study proposes a blockchain-based framework leveraging Thirdweb and Ethereum smart contracts to address these issues. The framework integrates decentralized storage via IPFS for cost-efficient off-chain genetic data storage, while on-chain smart contracts manage access control, encryption, and audit trails. Utilizing Solidity for smart contract development, the system ensures role-based permissions, wallet-based authentication, and immutable transaction logging. Genetic data in FASTA format, sourced from NCBI, is encrypted and linked to IPFS hashes stored on the blockchain. The architecture supports dual interfaces—command-line for developers and a Thirdweb dashboard for end-users—enabling secure data upload, access, and monitoring. Testing demonstrated functional efficacy in data integrity, access verification, and audit capabilities. Results highlight the system’s ability to enhance privacy, eliminate intermediaries, and provide transparent data governance. The integration of Thirdweb further decentralizes operations, aligning with Web 3.0 principles. Key contributions include a scalable model for genetic data sharing, a customizable smart contract template, and a user-centric design. Future work should explore advanced encryption, real-world healthcare integration, and performance optimization under high-throughput conditions. This research bridges biotechnology and blockchain, offering a robust foundation for secure genomic data ecosystems.
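The core pattern the abstract describes is an off-chain/on-chain split: the FASTA payload lives off-chain (IPFS), while the chain stores only a content hash plus access rules and an audit trail. The Python sketch below mimics that split with an in-memory "ledger"; the class and field names are hypothetical stand-ins, not the paper's Solidity contracts, and SHA-256 here stands in for an IPFS CID.

```python
import hashlib

class GeneticDataLedger:
    """Toy stand-in for the on-chain contract: maps content hashes to access rules."""
    def __init__(self):
        self.records = {}    # content_hash -> set of authorized wallet addresses
        self.audit_log = []  # immutable-log stand-in: (action, wallet, content_hash)

    def register(self, fasta_bytes, owner):
        # Hash of the off-chain payload; only this hash would be stored on-chain.
        content_hash = hashlib.sha256(fasta_bytes).hexdigest()
        self.records[content_hash] = {owner}
        self.audit_log.append(("register", owner, content_hash))
        return content_hash

    def grant(self, content_hash, owner, grantee):
        # Role-based permission: only a current owner may extend access.
        if owner in self.records.get(content_hash, set()):
            self.records[content_hash].add(grantee)
            self.audit_log.append(("grant", grantee, content_hash))

    def can_access(self, content_hash, wallet):
        allowed = wallet in self.records.get(content_hash, set())
        self.audit_log.append(("check", wallet, content_hash))
        return allowed

ledger = GeneticDataLedger()
h = ledger.register(b">seq1\nACGTACGT\n", owner="0xOwner")
ledger.grant(h, "0xOwner", "0xLab")
print(ledger.can_access(h, "0xLab"), ledger.can_access(h, "0xMallory"))  # True False
```

In the real framework these operations would be Solidity contract methods invoked through Thirdweb with wallet-based authentication, and the audit log would be the chain's transaction history rather than a Python list.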
Post-Quantum Cryptography Review in Future Cybersecurity Strengthening Efforts
Authors: Mu'min, Muhammad Amirul; Safitri, Yana; Saputra, Sabarudin; Sulistianingsih, Nani; Ragimova, Nazila; Abdullayev, Vugar
Scientific Journal of Engineering Research Vol. 1 No. 3 (2025): September
Publisher : PT. Teknologi Futuristik Indonesia

DOI: 10.64539/sjer.v1i3.2025.35

Abstract

The development of quantum computing technology brings significant challenges to the conventional cryptographic systems currently in wide use for digital data security. Attacks made possible by quantum computers have the potential to weaken classical algorithms such as RSA and ECC, so a new approach is needed that can guarantee long-term security. This study aims to systematically review the effectiveness and implementation readiness of post-quantum cryptography (PQC) algorithms, especially those recommended by NIST, in order to strengthen the resilience of future cybersecurity systems. The method used was a structured literature study with comparative analysis of lattice-based (Kyber and Dilithium), code-based (BIKE), and hash-based (SPHINCS+) PQC algorithms. Data were obtained from official documents of standards institutions as well as recent scientific publications. The analysis shows that lattice-based algorithms offer an optimal combination of security and efficiency, and demonstrate high readiness for implementation on constrained devices. Compared to the other algorithms, Kyber and Dilithium have advantages in performance and scalability. This research thus contributes a mapping of the practical readiness of PQC algorithms that has not been widely studied in previous work, and can serve as a basis for formulating future cryptographic adoption policies. These findings are expected to help the transition towards cryptographic systems that are resilient to quantum threats.
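The hash-based family the review covers (SPHINCS+) builds signatures purely from hash functions, whose security is not broken by Shor's algorithm. SPHINCS+ itself is far more involved, but the classic Lamport one-time signature below illustrates the underlying hash-based principle in a few lines of standard-library Python; it is a pedagogical sketch, not a production PQC scheme (each key pair must sign only one message).

```python
import hashlib
import secrets

def keygen(bits=256):
    """Lamport one-time keys: two random 32-byte preimages per message-digest bit."""
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(bits)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def _digest_bits(message):
    digest = hashlib.sha256(message).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(message, sk):
    """Reveal one preimage per bit of the message digest."""
    return [sk[i][bit] for i, bit in enumerate(_digest_bits(message))]

def verify(message, signature, pk):
    """Each revealed preimage must hash to the matching public-key element."""
    return all(hashlib.sha256(sig).digest() == pk[i][bit]
               for i, (sig, bit) in enumerate(zip(signature, _digest_bits(message))))

sk, pk = keygen()
sig = sign(b"post-quantum hello", sk)
print(verify(b"post-quantum hello", sig, pk))   # True
print(verify(b"tampered message", sig, pk))     # False
```

SPHINCS+ layers many such one-time signatures into a stateless many-time scheme via Merkle trees, which is what makes it practical while keeping its security hash-based.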
Robust Positive-Unlabeled Learning via Bounded Loss Functions under Label Noise
Authors: Awasthi, Lalit; Danso, Eric
Scientific Journal of Engineering Research Vol. 1 No. 3 (2025): September
Publisher : PT. Teknologi Futuristik Indonesia

DOI: 10.64539/sjer.v1i3.2025.314

Abstract

Positive-Unlabeled (PU) learning has become a pivotal tool in scenarios where only positive samples are labeled and negative labels are unavailable. However, in practical applications, the labeled positive data often contains noise, such as mislabeled or outlier instances, that can severely degrade model performance. This issue is exacerbated by traditional surrogate loss functions, many of which are unbounded and overly sensitive to mislabeled examples. To address this limitation, we propose a robust PU learning framework that integrates bounded loss functions, including ramp loss and truncated logistic loss, into the non-negative risk estimation paradigm. Unlike conventional loss formulations that allow noisy samples to disproportionately influence training, our approach caps each instance's contribution, thereby reducing sensitivity to label noise. We mathematically reformulate the PU risk estimator using bounded surrogates and demonstrate that this formulation maintains risk consistency while offering improved noise tolerance. A detailed framework diagram and algorithmic description are provided, along with theoretical analysis that bounds the influence of corrupted labels. Extensive experiments are conducted on both synthetic and real-world datasets under varying noise levels. Our method consistently outperforms baseline models such as unbiased PU (uPU) and non-negative PU (nnPU) in terms of classification accuracy, area under the receiver operating characteristic curve (ROC AUC), and precision-recall area under the curve (PR AUC). The ramp loss variant exhibits particularly strong robustness without sacrificing optimization efficiency. These results demonstrate that incorporating bounded losses is a principled and effective strategy for enhancing the reliability of PU learning in noisy environments.
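The boundedness argument in the abstract is easy to see numerically: the logistic surrogate grows without limit as a sample's margin becomes more negative, while the ramp loss caps every sample's contribution at 1. The NumPy sketch below contrasts the two; it illustrates the loss functions only, not the paper's full non-negative PU risk estimator.

```python
import numpy as np

def logistic_loss(z):
    """Unbounded surrogate: grows without limit as the margin z -> -inf,
    so a single badly mislabeled sample can dominate the empirical risk."""
    return np.log1p(np.exp(-z))

def ramp_loss(z):
    """Bounded surrogate: the hinge max(0, 1 - z) clipped at 1, so any
    (possibly mislabeled) sample contributes at most 1 to the risk."""
    return np.clip(1.0 - z, 0.0, 1.0)

# Margins z = y * f(x); large negative values are confidently misclassified,
# which is where mislabeled positives typically land.
for z in (-10.0, -1.0, 0.0, 1.0, 10.0):
    print(f"z={z:+6.1f}  logistic={logistic_loss(z):8.4f}  ramp={ramp_loss(z):.4f}")
```

At z = -10 the logistic loss is roughly 10 while the ramp loss is exactly 1, which is precisely the capping effect the framework exploits for noise robustness.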
