Contact Name
Johan Reimon Batmetan
Contact Email
garuda@apji.org
Phone
+6285885852706
Journal Mail Official
danang@stekom.ac.id
Editorial Address
Jl. Majapahit No.304, Pedurungan Kidul, Kec. Pedurungan, Semarang, Provinsi Jawa Tengah, 52361
Location
Kota Semarang,
Jawa Tengah
INDONESIA
Journal of Technology Informatics and Engineering
ISSN: 2961-9068     EISSN: 2961-8215     DOI: 10.51903
Core Subject: Science
Power Engineering, Telecommunication Engineering, Computer Engineering, Control and Computer Systems, Electronics, Information Technology, Informatics, Data and Software Engineering, Biomedical Engineering
Articles: 161 Documents
Optimizing AI Performance in Industry: A Hybrid Computing Architecture Approach Based on Big Data Dewi, Maya Utami; Santoso, Lukman; Santoso, Agustinus Budi
Journal of Technology Informatics and Engineering Vol. 3 No. 3 (2024): December (Special Issue: Big Data Analytics) | JTIE: Journal of Technology Informatics and Engineering
Publisher : University of Science and Computer Technology

DOI: 10.51903/jtie.v3i3.201

Abstract

In the era of Industry 4.0, integrating artificial intelligence (AI) and big data analytics in the industrial sector demands high-performance computing infrastructure to handle increasingly complex and voluminous datasets. This study investigates the optimization of AI performance by implementing a hybrid computing architecture, integrating CPUs, GPUs, FPGAs, and edge-cloud computing. The research aims to enhance processing speed, model accuracy, and energy efficiency, addressing the limitations of standalone computing systems. A quantitative methodology was employed, using over 1 TB of industrial data from IoT sensors and production logs. A hybrid architecture was implemented with dynamic workload scheduling to distribute tasks efficiently across computational components. Performance metrics included processing time, model accuracy, energy consumption, and cost analysis. Results demonstrated that hybrid architectures significantly improved performance: the CPU-GPU combination reduced processing times to 650 ms, increased model accuracy to 88.3%, and achieved an energy consumption of 2.1 kWh. Meanwhile, the CPU-FPGA configuration, while slightly less accurate (87.5%), proved more energy-efficient at 1.3 kWh. AI models developed using hybrid systems exhibited superior predictive accuracy, with Mean Squared Error (MSE) as low as 0.0248 and R² of 0.91. The study concludes that hybrid computing architecture is a transformative approach for optimizing AI systems in industrial applications, balancing speed, accuracy, and energy efficiency. These findings provide actionable insights for industries aiming to leverage advanced computing technologies for improved operational efficiency and sustainability. Future research should focus on advanced workload scheduling and cost-effectiveness strategies to maximize the potential of hybrid systems.
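
As a rough, hypothetical sketch of the dynamic workload scheduling idea described above (not the authors' implementation), the following Python snippet greedily assigns tasks to whichever device, CPU, GPU, or FPGA, becomes free soonest; the device speed and power figures are invented for illustration.

# Hypothetical sketch of dynamic workload scheduling across heterogeneous
# devices; device profiles and the cost model are illustrative assumptions.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Device:
    busy_until: float = 0.0                              # time the device becomes free (s)
    name: str = field(compare=False, default="")
    speedup: float = field(compare=False, default=1.0)   # relative throughput
    power_kw: float = field(compare=False, default=0.1)  # assumed power draw

def schedule(tasks, devices):
    """Greedily assign each task to the device that becomes free soonest."""
    heap = list(devices)
    heapq.heapify(heap)
    plan, energy_kwh = [], 0.0
    for task_name, base_cost_s in tasks:
        dev = heapq.heappop(heap)                 # device free earliest
        run_s = base_cost_s / dev.speedup
        start = dev.busy_until
        dev.busy_until = start + run_s
        energy_kwh += dev.power_kw * run_s / 3600.0
        plan.append((task_name, dev.name, round(start + run_s, 2)))
        heapq.heappush(heap, dev)
    return plan, energy_kwh

devices = [Device(name="CPU", speedup=1.0, power_kw=0.15),
           Device(name="GPU", speedup=6.0, power_kw=0.30),
           Device(name="FPGA", speedup=4.0, power_kw=0.08)]
tasks = [("preprocess", 12.0), ("train_batch", 48.0), ("inference", 6.0)]
print(schedule(tasks, devices))
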
The Use of Machine Learning for Efficient Energy Management in Big Data-Based Computing Systems Putra, Toni Wijanarko Adi; Setiawan, Nuris Dwi; Rusito, Rusito
Journal of Technology Informatics and Engineering Vol. 3 No. 3 (2024): December (Special Issue: Big Data Analytics) | JTIE: Journal of Technology Informatics and Engineering
Publisher : University of Science and Computer Technology

DOI: 10.51903/jtie.v3i3.202

Abstract

The rapid growth of digital services has intensified the energy demands of data centers, significantly impacting operational costs and global carbon footprints. This study leverages Machine Learning (ML) and Big Data to optimize energy management in data centers, addressing limitations in prior approaches that overlooked real-time variability. A predictive model utilizing the Random Forest algorithm was developed to reduce energy consumption based on dynamic factors, such as workload and environmental conditions like temperature. The research used a six-month dataset consisting of approximately 3 million data points from an operational data center. After preprocessing the data, the model achieved a high predictive accuracy, reflected by an R-squared value of 0.87. The findings demonstrate that the model reduces energy consumption by an average of 11.17% daily, with peak savings of up to 15.56% during off-peak hours. Key metrics, including a Mean Squared Error (MSE) of 0.034 and a Root Mean Squared Error (RMSE) of 0.184, validated the model's effectiveness and reliability. Statistical tests further confirmed its precision within a 95% confidence interval. This study contributes to the academic field by integrating real-time environmental data into predictive modeling, offering a scalable solution for energy-efficient data center operations. The outcomes support sustainability initiatives by mitigating carbon emissions and reducing operational costs. The findings also provide a framework for applying ML in broader industrial contexts requiring efficient energy management. Future research may explore incorporating additional variables, such as user behavior, to further refine predictive capabilities.
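
A minimal sketch of the kind of Random Forest energy model the abstract describes, using scikit-learn; the feature names (workload, inlet temperature, hour of day) and the synthetic data are assumptions, not the study's dataset.

# Sketch of a Random Forest regressor predicting energy consumption from
# workload and environmental features; columns and data are assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "workload_pct": rng.uniform(10, 100, n),   # server utilisation
    "inlet_temp_c": rng.uniform(18, 32, n),    # ambient temperature
    "hour_of_day": rng.integers(0, 24, n),
})
# Synthetic target just to make the sketch runnable end to end.
df["energy_kwh"] = (0.04 * df["workload_pct"]
                    + 0.10 * df["inlet_temp_c"]
                    + rng.normal(0, 0.5, n))

X, y = df.drop(columns="energy_kwh"), df["energy_kwh"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
mse = mean_squared_error(y_te, pred)
print("MSE:", mse, "RMSE:", mse ** 0.5, "R2:", r2_score(y_te, pred))
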
Enhancing AI Model Accuracy and Scalability Through Big Data and Cloud Computing Jamaludin, Haris; Achlison, Unang; Rokhman, Nur
Journal of Technology Informatics and Engineering Vol. 3 No. 3 (2024): December (Special Issue: Big Data Analytics) | JTIE: Journal of Technology Informatics and Engineering
Publisher : University of Science and Computer Technology

DOI: 10.51903/jtie.v3i3.203

Abstract

Data's exponential growth and cloud computing advancements have significantly impacted artificial intelligence (AI) model development. This study investigates how big data techniques integrated with cloud computing enhance the scalability and accuracy of AI models across sectors such as healthcare, business, and cybersecurity. Adopting a qualitative methodology, the research examines secondary data from 2020–2024, including case studies and literature. Key findings reveal that cloud computing enables large-scale data processing with significant efficiency, achieving average speeds of 20–45 seconds for datasets ranging from 50–120 TB/day. AI model accuracy also improved across sectors, increasing by 20% on average—reaching 92% in cybersecurity, 90% in healthcare, and 85% in business applications. The study identifies deep learning algorithms as pivotal for leveraging cloud computing's flexibility, allowing for advanced data analysis and real-time insights. However, challenges in data security and privacy remain critical concerns. This research contributes by highlighting the transformative role of cloud computing in big data management and AI optimization, offering practical insights into enhancing predictive capabilities while addressing operational cost efficiency through scalable infrastructure. The findings emphasize the necessity of robust security protocols to mitigate risks and ensure sustainable AI applications. Future research should explore sector-specific implementations to refine and expand the practical utility of these integrated technologies.
Integrating Big Data and Edge Computing for Enhancing AI Efficiency in Real-Time Applications Susatyono, Jarot Dian; Suasana, Iman Saufik; Rozikin, Khoirur
Journal of Technology Informatics and Engineering Vol. 3 No. 3 (2024): December (Special Issue: Big Data Analytics) | JTIE: Journal of Technology Informatics and Engineering
Publisher : University of Science and Computer Technology

DOI: 10.51903/jtie.v3i3.204

Abstract

Integrating Big Data and Edge Computing is revolutionizing the efficiency of artificial intelligence (AI) systems, particularly in applications requiring real-time responses. This study explores the synergistic role of these technologies in two critical sectors: autonomous vehicles and healthcare. Using a case study approach, real-world datasets and simulation platforms were employed to evaluate improvements in latency, prediction accuracy, and system efficiency. Key findings reveal that Edge Computing reduces latency by 30%, with response times dropping from 150 ms to 105 ms in autonomous vehicles and from 200 ms to 140 ms in healthcare applications. Additionally, leveraging Big Data for AI training enhanced prediction accuracy by 15% for traffic pattern recognition and 12% for patient condition monitoring. Despite these advancements, challenges such as scalability, data security, and interoperability persist, necessitating robust infrastructure and end-to-end encryption solutions. This research highlights the transformative potential of combining Big Data and Edge Computing to optimize AI systems for real-time applications, offering insights into improving operational efficiency and predictive accuracy. The findings are expected to guide future developments in AI technologies, particularly in the context of expanding 5G networks and growing demand for real-time data processing.  
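
A quick worked check of the latency figures quoted above, in Python:

# Reductions reported in the abstract: both sectors show a 30% drop in latency.
cases = {"autonomous vehicles": (150, 105), "healthcare": (200, 140)}
for name, (cloud_ms, edge_ms) in cases.items():
    reduction = (cloud_ms - edge_ms) / cloud_ms * 100
    print(f"{name}: {cloud_ms} ms -> {edge_ms} ms ({reduction:.0f}% lower latency)")
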
Enhancing Big Data Processing Efficiency in AI-Based Healthcare Systems: A Comparative Analysis of Random Forest and Deep Learning Priyadi, Priyadi; Migunani, Migunani; Sasmoko, Dani
Journal of Technology Informatics and Engineering Vol. 3 No. 3 (2024): December (Special Issue: Big Data Analytics) | JTIE: Journal of Technology Informatics and Engineering
Publisher : University of Science and Computer Technology

DOI: 10.51903/jtie.v3i3.205

Abstract

This research focuses on optimizing the speed of Big Data processing using Artificial Intelligence (AI) in healthcare applications. The study integrates Random Forest (RF) and Deep Learning (DL) algorithms with cloud-based computing systems to improve data processing efficiency. The dataset includes both structured data, such as Electronic Health Records (EHR), and unstructured data, like medical images. The results show that RF performs better with structured data, achieving a lower Mean Squared Error (MSE) and higher R-squared (R²) than traditional methods. Meanwhile, DL achieves superior accuracy and Area Under the Curve (AUC) in processing unstructured data. By utilizing the distributed computing power of Spark on a cloud platform, the processing speed was significantly enhanced, as demonstrated by a statistically significant reduction in processing time (p < 0.05) observed through a t-test analysis comparing Spark-based computing with traditional methods. Despite these improvements, challenges such as data privacy and infrastructure costs remain. This research provides a robust framework for real-time healthcare data analysis, highlighting its potential to improve decision-making processes and patient outcomes in medical services.
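
The t-test comparison mentioned above can be illustrated with a short sketch using scipy; the timing samples below are synthetic stand-ins, not the study's measurements.

# Illustrative t-test comparing Spark-based and traditional processing times;
# the timing distributions are invented for the sketch.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
traditional_s = rng.normal(loc=120.0, scale=15.0, size=30)   # assumed batch times (s)
spark_s = rng.normal(loc=45.0, scale=8.0, size=30)           # assumed Spark times (s)

t_stat, p_value = stats.ttest_ind(traditional_s, spark_s, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Processing-time difference is statistically significant at the 5% level.")
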
Prediction and Detection of Scam Threats on Digital Platforms for Indonesian Users Using Machine Learning Models Raharjo, Budi; Rudjiono; Fitrianto, Yuli
Journal of Technology Informatics and Engineering Vol. 3 No. 3 (2024): December (Special Issue: Big Data Analytics) | JTIE: Journal of Technology Informatics and Engineering
Publisher : University of Science and Computer Technology

DOI: 10.51903/jtie.v3i3.208

Abstract

Scam threats on digital platforms continue to rise alongside the rapid adoption of technology in Indonesia. The unique characteristics of Indonesian digital users, such as low digital literacy and high social media usage, make them particularly vulnerable to various forms of scams, including phishing, impersonation, and emotional manipulation. This study aims to develop a machine learning-based model for predicting and detecting scams by identifying threat patterns within a local context. The methodology involves collecting a survey-based dataset from Indonesian digital users, capturing language patterns and user interaction behaviors. The dataset was processed through text-cleaning techniques, tokenization, normalization, and representation using TF-IDF and Word Embeddings. The machine learning models employed in this study are Random Forest and Support Vector Machine (SVM), evaluated using accuracy, precision, recall, and F1-score metrics. Hyperparameter tuning was conducted to optimize model performance, while k-fold cross-validation was utilized to minimize the risk of overfitting. The results indicate that the Random Forest model achieved the best performance, with an accuracy of 92.5%, precision of 90.7%, recall of 94.1%, and F1-score of 92.4%. The use of local datasets improved detection accuracy by 7.8% compared to global datasets, highlighting the critical importance of contextual representation in identifying scam patterns specific to Indonesia. The model was also effective in recognizing unique threat patterns, such as the use of informal language and manipulative phrases in scam messages. This study makes a significant contribution to the field of digital security by providing an effective machine learning-based approach to detecting scam threats in Indonesia. Moreover, the findings underscore the importance of developing local datasets and educating users as part of a holistic solution to enhance digital security. These insights emphasize the necessity of incorporating cultural and contextual factors into technology-driven approaches for combating scams in developing countries like Indonesia
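
A hedged sketch of the TF-IDF plus Random Forest / SVM pipeline with cross-validation outlined in the abstract; the sample messages and labels are invented for illustration, and the survey dataset is not reproduced here.

# Sketch of a TF-IDF + Random Forest / SVM scam classifier with k-fold
# cross-validation; the tiny Indonesian sample (1 = scam, 0 = legitimate)
# is made up to reflect the paper's local-context focus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import LinearSVC
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

texts = ["Selamat! Anda memenangkan hadiah, klik tautan ini",
         "Rapat tim dijadwalkan ulang ke hari Kamis",
         "Transfer biaya admin dulu agar dana cair",
         "Terima kasih, dokumen sudah saya terima"]
labels = [1, 0, 1, 0]

for name, clf in [("RandomForest", RandomForestClassifier(n_estimators=100, random_state=0)),
                  ("LinearSVM", LinearSVC())]:
    pipe = Pipeline([("tfidf", TfidfVectorizer(lowercase=True, ngram_range=(1, 2))),
                     ("clf", clf)])
    scores = cross_val_score(pipe, texts, labels, cv=2, scoring="f1")
    print(name, "mean F1:", scores.mean())
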
Design Framework of Expert System Program in Otolaryngology Disease Diagnosis Using the Extreme Programming (XP) Method (Case Study at THB Bekasi Hospital) Melyani, Melyani; Prasetyo, Trisna Fajar; Rahadjeng, Indra Riyana; Mufid, Zainul; Rafik, Ahmad; Shaura, Rizkiana Karmelia; Daniel, Daniel; Emita, Isyana
Journal of Technology Informatics and Engineering Vol. 3 No. 3 (2024): December (Special Issue: Big Data Analytics) | JTIE: Journal of Technology Informatics and Engineering
Publisher : University of Science and Computer Technology

DOI: 10.51903/jtie.v3i3.209

Abstract

The prevalence of Ear, Nose, and Throat (ENT) diseases presents diagnostic challenges, especially in resource-limited settings. At THB Bekasi Hospital, constrained specialist availability and long consultation queues highlight the need for an accessible diagnostic solution. This study aims to develop an expert system for diagnosing ENT diseases using the Extreme Programming (XP) methodology, incorporating the forward chaining technique for inference. The research includes assessment, knowledge acquisition through expert consultations, system design, and rigorous testing. The system was developed as a mobile application using Android Studio, enabling users to input symptoms and receive real-time diagnostic insights. The knowledge base integrates data from medical experts, synthesizing 11 diseases and 35 symptoms into a robust decision-making framework. The diagnostic process applies predefined rules to ensure accuracy in identifying conditions such as sinusitis, laryngitis, and otitis. Evaluation results demonstrate a 100% match accuracy during testing with 15 test cases, confirming the system’s reliability. The application offers users rapid diagnostic assistance, promoting timely treatment for ENT issues, although it does not substitute medical professionals. By leveraging ubiquitous smartphone access, this system addresses gaps in healthcare accessibility and enhances patient autonomy. This research contributes a scalable framework for deploying expert systems in other medical domains. Future improvements include integrating geolocation services for nearby specialist referrals and adopting backward chaining for more complex diagnoses, thereby broadening its applicability and utility
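
A minimal sketch of forward-chaining inference over symptom-to-disease rules, the technique the abstract names; the three rules below are invented examples, not the system's knowledge base of 11 diseases and 35 symptoms.

# Minimal forward-chaining sketch: fire every rule whose required symptoms
# are all present in the observed facts. Rules here are illustrative only.
RULES = {
    "sinusitis": {"facial_pain", "nasal_congestion", "headache"},
    "laryngitis": {"hoarseness", "sore_throat"},
    "otitis_media": {"ear_pain", "hearing_loss", "fever"},
}

def diagnose(symptoms):
    """Return every disease whose rule is fully satisfied by the symptoms."""
    facts = set(symptoms)
    conclusions = [disease for disease, required in RULES.items() if required <= facts]
    return conclusions or ["no rule matched; refer to a specialist"]

print(diagnose({"hoarseness", "sore_throat", "cough"}))   # -> ['laryngitis']
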
Scalable and Secure IoT-Driven Vibration Monitoring: Advancing Predictive Maintenance in Industrial Systems Ibrahim, Said Maulana; Go, Eun-Myeong; Iranda, Jennifer
Journal of Technology Informatics and Engineering Vol. 3 No. 3 (2024): December (Special Issue: Big Data Analytics) | JTIE: Journal of Technology Informatics and Engineering
Publisher : University of Science and Computer Technology

DOI: 10.51903/jtie.v3i3.210

Abstract

The rapid evolution of Industry 4.0 has positioned Internet of Things (IoT) technologies as key enablers for smarter industrial operations, particularly in predictive maintenance and machine monitoring. This research proposes an innovative IoT-driven vibration monitoring system that addresses limitations in traditional approaches such as high costs, limited scalability, and insufficient real-time capabilities. Employing low-cost sensors, edge computing, and LoRaWAN-based communication, the framework enables efficient fault detection and operational analysis. Data from industrial machinery was collected over two months and analyzed using advanced signal processing and machine learning techniques to extract meaningful insights. The system demonstrated an accuracy rate of 92%, a detection latency of 150 milliseconds, and extended sensor life to 12 months, marking significant improvements over conventional methods. Furthermore, scalability tests showed stable performance across setups involving up to 500 sensors, even in challenging industrial conditions. This study also highlights cost reductions of 30% and a 25% decline in machine downtime, reinforcing its practical value for industrial applications. By delivering an adaptable, energy-efficient, and secure solution, this research advances the integration of IoT into industrial systems. It lays the groundwork for future enhancements, including real-world testing and multimodal data integration
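
An illustrative sketch of vibration-based fault detection in the spirit of the paper: extract simple spectral features with an FFT and train a classifier. The sampling rate, synthetic signals, and choice of Random Forest are assumptions, not the authors' exact pipeline.

# FFT feature extraction + classifier on synthetic vibration signals;
# all signal parameters below are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FS = 1000  # assumed sampling rate in Hz

def spectral_features(signal):
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    peak_freq = freqs[np.argmax(spectrum)]
    return [peak_freq, spectrum.mean(), spectrum.std()]

rng = np.random.default_rng(1)
t = np.arange(0, 1, 1.0 / FS)
healthy = [np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(50)]
faulty = [np.sin(2 * np.pi * 50 * t) + 0.8 * np.sin(2 * np.pi * 180 * t)
          + 0.1 * rng.standard_normal(t.size) for _ in range(50)]

X = np.array([spectral_features(s) for s in healthy + faulty])
y = np.array([0] * 50 + [1] * 50)    # 0 = healthy, 1 = synthetic fault signature

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
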
Transformers in Cybersecurity: Advancing Threat Detection and Response through Machine Learning Architectures Hartono, Budi; Silalahi, Fujiama Diapoldo; Muthohir, Moh
Journal of Technology Informatics and Engineering Vol. 3 No. 3 (2024): December (Special Issue: Big Data Analytics) | JTIE: Journal of Technology Informatics and Engineering
Publisher : University of Science and Computer Technology

DOI: 10.51903/jtie.v3i3.211

Abstract

The increasing sophistication of cyber threats has outpaced the capabilities of traditional detection and response systems, necessitating the adoption of advanced machine learning architectures. This study investigates the application of Transformer-based models in cybersecurity, focusing on their ability to enhance threat detection and response. Leveraging publicly available datasets, including CICIDS 2017 and UNSW-NB15, the research employs a systematic methodology encompassing data preprocessing, model optimization, and comparative performance evaluation. The Transformer model, tailored for cybersecurity, integrates self-attention mechanisms and positional encoding to capture complex dependencies in network traffic data. The experimental results reveal that the proposed model achieves an accuracy of 97.8%, outperforming conventional methods such as Random Forest (92.3%) and deep learning approaches like CNN (94.1%) and LSTM (95.6%). Additionally, the Transformer demonstrates high detection rates across diverse attack types, with rates exceeding 98% for Denial of Service and Brute Force attacks. Attention heatmaps provide valuable insights into feature importance, enhancing the interpretability of the model’s decisions. Scalability tests confirm the model’s ability to handle large datasets efficiently, positioning it as a robust solution for dynamic cybersecurity environments. This research contributes to the field by demonstrating the feasibility and advantages of employing Transformer architectures for complex threat detection tasks. The findings have significant implications for developing scalable, interpretable, and adaptive cybersecurity systems. Future studies should explore lightweight Transformer variants and evaluate the model in operational environments to address practical deployment challenges.
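
A hedged PyTorch sketch of a Transformer-encoder classifier over network-flow features, in the spirit of the model described; layer sizes, the 40-dimensional feature vector, and the binary head are illustrative assumptions (positional encoding is omitted for brevity).

# Transformer encoder over network-flow feature sequences; dimensions and the
# two-class head (attack vs benign) are assumptions for this sketch.
import torch
import torch.nn as nn

class FlowTransformer(nn.Module):
    def __init__(self, n_features=40, d_model=64, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)          # project features to d_model
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):             # x: (batch, seq_len, n_features)
        h = self.encoder(self.embed(x))
        return self.head(h.mean(dim=1))                       # pool over the sequence

model = FlowTransformer()
flows = torch.randn(8, 10, 40)        # 8 windows of 10 flow records, 40 features each
print(model(flows).shape)             # torch.Size([8, 2])
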
Comparative Study of Feature Engineering Techniques for Predictive Data Analytics Santoso, Lukman; Priyadi
Journal of Technology Informatics and Engineering Vol. 3 No. 2 (2024): August : Journal of Technology Informatics and Engineering
Publisher : University of Science and Computer Technology

DOI: 10.51903/jtie.v3i2.225

Abstract

In the rapidly evolving era of big data, predictive analytics has become a crucial approach in supporting data-driven decision-making across various sectors such as finance, healthcare, and marketing. However, the effectiveness of predictive models is highly dependent on the quality of features utilized in model training. This study aims to evaluate and compare various feature engineering techniques to enhance the accuracy of predictive models based on Random Forest (RF) and Extreme Gradient Boosting (XGBoost) algorithms. The research employs a quantitative experimental approach by applying different feature engineering techniques, including SHAP-based feature importance, Principal Component Analysis (PCA), and categorical variable encoding. The evaluation results indicate that the implementation of SHAP-based feature importance yields the best outcomes, with a Mean Squared Error (MSE) of 0.150 and a Root Mean Squared Error (RMSE) of 0.387 in the XGBoost model. These values outperform those without feature engineering, which recorded an MSE of 0.230 and an RMSE of 0.479. The combination of PCA and encoding techniques also shows a significant performance improvement with an MSE of 0.160 and an RMSE of 0.400. The XGBoost algorithm consistently demonstrates superior performance compared to RF across various testing scenarios. The contribution of this study lies in its recommendation of appropriate feature engineering techniques to improve the predictive quality of Machine Learning (ML)  models. This research provides insights for researchers and practitioners in developing more effective feature engineering strategies and opens opportunities for exploring advanced techniques in more complex data domains.
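
An illustrative sketch of the SHAP-based feature-selection idea compared against a plain XGBoost fit; the synthetic regression data and the top-8 cut-off are assumptions, not the study's setup.

# Rank features by mean |SHAP value| from a fitted XGBoost model, refit on the
# top-ranked subset, and compare test MSE; data and thresholds are synthetic.
import numpy as np
import shap
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

X, y = make_regression(n_samples=2000, n_features=20, n_informative=8,
                       noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

base = xgb.XGBRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
mse_all = mean_squared_error(y_te, base.predict(X_te))

shap_values = shap.TreeExplainer(base).shap_values(X_tr)
importance = np.abs(shap_values).mean(axis=0)
top = np.argsort(importance)[::-1][:8]        # keep the 8 most influential features

refit = xgb.XGBRegressor(n_estimators=300, random_state=0).fit(X_tr[:, top], y_tr)
mse_top = mean_squared_error(y_te, refit.predict(X_te[:, top]))
print("MSE all features:", mse_all, "| MSE top SHAP features:", mse_top)
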

Issues
Vol. 4 No. 3 (2025): December
Vol. 4 No. 2 (2025): August
Vol. 4 No. 1 (2025): April
Vol. 3 No. 3 (2024): December (Special Issue: Big Data Analytics)
Vol. 3 No. 2 (2024): August
Vol. 3 No. 1 (2024): April
Vol. 2 No. 3 (2023): December
Vol. 2 No. 2 (2023): August
Vol. 2 No. 1 (2023): April
Vol. 1 No. 3 (2022): December
Vol. 1 No. 2 (2022): August
Vol. 1 No. 1 (2022): April