Found 32 Documents

Bridging the Gap: Integrating Organizational Change Management with IT Project Delivery Zangana, Hewa Majeed; Ali, Natheer Yaseen; Zeebaree, Subhi R. M.
Sistemasi: Jurnal Sistem Informasi Vol 13, No 5 (2024): Sistemasi: Jurnal Sistem Informasi
Publisher : Program Studi Sistem Informasi Fakultas Teknik dan Ilmu Komputer

DOI: 10.32520/stmsi.v13i5.4450

Abstract

In today's rapidly evolving technological landscape, the successful implementation of IT projects is increasingly contingent upon effective organizational change management (OCM). This research paper explores the intersection of OCM and IT project delivery, proposing a comprehensive framework that integrates these two critical domains. Through a review of existing literature and analysis of case studies, we identify key challenges and best practices for synchronizing OCM strategies with IT project management processes. Our findings reveal that the alignment of OCM with IT project delivery not only enhances project success rates but also promotes sustainable organizational transformation. This integrated approach ensures that technological advancements are supported by a well-prepared workforce, thereby minimizing resistance and maximizing adoption. The paper concludes with practical recommendations for practitioners aiming to bridge the gap between OCM and IT project delivery, ultimately fostering a more agile and resilient organizational environment.
Systematic Review of Decentralized and Collaborative Computing Models in Cloud Architectures for Distributed Edge Computing Zangana, Hewa Majeed; Mohammed, Ayaz khalid; Zeebaree, Subhi R. M.
Sistemasi: Jurnal Sistem Informasi Vol 13, No 4 (2024): Sistemasi: Jurnal Sistem Informasi
Publisher : Program Studi Sistem Informasi Fakultas Teknik dan Ilmu Komputer

DOI: 10.32520/stmsi.v13i4.4169

Abstract

This systematic review paper delves into the evolving landscape of cloud architectures for distributed edge computing, with a particular focus on decentralized and collaborative computing models. The aim of this systematic review is to synthesize recent advancements in decentralization techniques, collaborative scheduling, federated learning, and blockchain integration for edge computing. As edge computing becomes increasingly vital for supporting the Internet of Things (IoT) and other distributed systems, innovative strategies are needed to address challenges related to latency, resource management, and data security. The key findings highlight the benefits of latency-aware task management, autonomous serverless frameworks, and the collaborative sharing of computational resources. Additionally, the integration of federated learning and blockchain technologies offers promising solutions for enhancing data privacy and resource allocation. The versatility of edge computing is showcased through its applications in diverse domains, including healthcare and smart cities. Future research directions emphasize the need for optimized resource management, improved security protocols, standardization efforts, and application-specific innovations. By providing a comprehensive review of these developments, this paper underscores the critical role of decentralized and collaborative models in advancing the capabilities and efficiency of edge computing systems.
Deep Learning-based Gold Price Prediction: A Novel Approach using Time Series Analysis Zangana, Hewa Majeed; Obeyd, Salah Ramadan
Sistemasi: Jurnal Sistem Informasi Vol 13, No 6 (2024): Sistemasi: Jurnal Sistem Informasi
Publisher : Program Studi Sistem Informasi Fakultas Teknik dan Ilmu Komputer

DOI: 10.32520/stmsi.v13i6.4651

Abstract

This paper presents a deep learning-based system for predicting gold prices using historical data. The system leverages Long Short-Term Memory (LSTM), a specialized recurrent neural network architecture, to capture temporal dependencies and patterns in the time series data of gold prices. A comprehensive dataset of historical gold prices is used, and the model is trained on a sequence of past data points to predict future prices. The data is preprocessed using normalization techniques to improve the performance of the model. Experimental results demonstrate the effectiveness of the proposed model in providing accurate price predictions, offering potential utility in financial forecasting and decision-making processes. The system's performance is evaluated through visualization and statistical metrics, illustrating its capacity to track gold price trends and predict future market movements. This work contributes to the growing field of time series forecasting by applying deep learning techniques to financial markets.
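The preprocessing described in this abstract, normalization followed by windowing of the price series into training sequences, can be sketched as below. This is a minimal illustration, not the paper's implementation; the window length and prices are made up.

```python
# Sketch of the preprocessing for an LSTM price forecaster: min-max
# normalization of a gold-price series, then sliding-window construction
# of (past-window, next-value) training pairs. Values are illustrative.

def normalize(prices):
    """Scale a price series to [0, 1] (min-max normalization)."""
    lo, hi = min(prices), max(prices)
    return [(p - lo) / (hi - lo) for p in prices]

def make_windows(series, window=3):
    """Turn a series into (input sequence, target) training pairs."""
    pairs = []
    for i in range(len(series) - window):
        pairs.append((series[i:i + window], series[i + window]))
    return pairs

prices = [1800.0, 1825.0, 1810.0, 1840.0, 1860.0, 1855.0]
scaled = normalize(prices)
pairs = make_windows(scaled, window=3)
```

Each pair feeds the LSTM a fixed-length history and asks it to predict the next (normalized) price; predictions are de-normalized with the saved min and max.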
Power System Stabilizer Optimization Based on Modified Black‑Winged Kite Algorithm Aribowo, Widi; Abualigah, Laith; Oliva, Diego; B, Nur Vidia Laksmi; Amaliah, Fithrotul Irda; Aziz, As’ad Shidqy; Zangana, Hewa Majeed
Buletin Ilmiah Sarjana Teknik Elektro Vol. 7 No. 4 (2025): December
Publisher : Universitas Ahmad Dahlan

DOI: 10.12928/biste.v7i4.14669

Abstract

This article presents a modified method for tuning the parameters of a power system stabilizer (PSS), based on the Black-winged Kite Algorithm (BKA). The BKA is inspired by the migratory and predatory habits of the black-winged kite; the proposed modification combines Leader and Cauchy mutation strategies to improve the algorithm's global search capacity and convergence rate. This article includes comparative simulations of the PSS objective function and transient response to verify the effectiveness of the suggested strategy. The study validates the proposed method through comparison with both conventional techniques and the original BKA. Simulation results demonstrate that, when benchmarked against competing algorithms, the proposed method consistently yields the best performance and exhibits faster convergence in certain scenarios. Notably, it reduces undershoot and overshoot by an average of 65% and 90.22%, respectively, compared to the PSS-Lead Lag method. Furthermore, the proposed approach not only minimizes overshoot and undershoot but also achieves a significantly faster settling time.
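The Cauchy mutation step mentioned above can be sketched as follows. This is a hedged illustration only: the parameter names (a gain and two time constants of a lead-lag PSS), bounds, and step scale are assumptions, and the full BKA additionally includes leader-guided moves and a power-system fitness model, which are omitted here.

```python
import math
import random

# Sketch of a Cauchy mutation applied to a candidate PSS parameter vector.
# A standard Cauchy variate is drawn via tan(pi * (u - 0.5)); heavy tails
# give occasional long jumps that help escape local optima. Bounds and
# scale are illustrative, not from the paper.

def cauchy_mutate(params, lo, hi, scale=0.1, rng=None):
    rng = rng or random.Random(0)
    out = []
    for p, a, b in zip(params, lo, hi):
        step = scale * (b - a) * math.tan(math.pi * (rng.random() - 0.5))
        out.append(min(max(p + step, a), b))  # clamp to the search bounds
    return out

gains = [10.0, 0.05, 0.02]                 # hypothetical K, T1, T2
lo, hi = [1.0, 0.01, 0.01], [50.0, 1.0, 1.0]
mutated = cauchy_mutate(gains, lo, hi)
```

In a full metaheuristic loop, mutated candidates would be scored by simulating the power system's transient response and kept only if they improve the objective.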
AI-Driven Threat Intelligence on Blockchain Using Deep Learning for Decentralized Cyber Risk Prediction Zangana, Hewa Majeed; Beitollahi, Hakem; Muhamad, Sabat Salih; Mohammed, Aquil Mirza; Wani, Sharyar
Control Systems and Optimization Letters Vol 3, No 3 (2025)
Publisher : Peneliti Teknologi Teknik Indonesia

DOI: 10.59247/csol.v3i3.262

Abstract

The increasing complexity of cyber threats such as advanced persistent threats (APTs), ransomware, distributed denial-of-service (DDoS), and smart contract exploits requires cybersecurity solutions that go beyond traditional centralized defenses. This paper proposes an AI-driven threat intelligence framework integrated with blockchain technology for decentralized and trustworthy cyber risk prediction. The novelty of the proposed framework lies in its hybrid architecture, where deep learning–based anomaly detection models (including LSTM and autoencoder networks) analyze real-time cybersecurity data—such as blockchain transaction logs, network activity records, and external threat intelligence feeds—while blockchain is used to securely store, validate, and share AI-generated threat intelligence in a tamper-resistant and decentralized manner. Unlike AI-only solutions that suffer from data integrity and trust issues, or blockchain-only approaches that lack intelligent threat detection, the proposed framework combines the strengths of both technologies to enhance detection accuracy and stakeholder trust. Experimental evaluation conducted in a simulated blockchain environment demonstrates a detection accuracy of 96.4%, a false positive rate of 3.6%, and effective identification of multiple attack categories, including smart contract exploits and 51% attacks. While the framework improves security and transparency for inter-organizational security teams, enterprise networks, and supply-chain partners, it also introduces challenges related to computational overhead and blockchain scalability. Overall, the results indicate that integrating AI-driven threat intelligence with blockchain offers a practical and robust solution for decentralized cybersecurity risk prediction.
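The tamper-resistant storage idea described above can be caricatured with a hash-chained ledger of AI-generated alerts: each record commits to its predecessor's hash, so any later modification breaks verification. This is a minimal stand-in, not the paper's system; field names are illustrative, and a real deployment would use a full blockchain with consensus.

```python
import hashlib
import json

# Minimal hash-chained ledger of threat alerts. Each block stores the
# previous block's hash, making retroactive edits detectable.

def _digest(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_alert(chain, alert):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"alert": alert, "prev": prev}
    block["hash"] = _digest({"alert": alert, "prev": prev})
    chain.append(block)
    return chain

def verify(chain):
    prev = "0" * 64
    for block in chain:
        expected = _digest({"alert": block["alert"], "prev": block["prev"]})
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

chain = []
append_alert(chain, {"type": "ddos", "score": 0.97})
append_alert(chain, {"type": "contract_exploit", "score": 0.91})
```

Verification walks the chain and recomputes each digest; flipping any stored score or type invalidates every subsequent link.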
The Synergy of Blockchain and Cybersecurity: Building Trust in Digital Environments Zangana, Hewa Majeed; Sallow, Zina Bibo; Mustafa, Firas Mahmood; Husain, Mamo Muhamad
Jurnal ELTIKOM : Jurnal Teknik Elektro, Teknologi Informasi dan Komputer Vol. 9 No. 2 (2025)
Publisher : P3M Politeknik Negeri Banjarmasin

DOI: 10.31961/eltikom.v9i2.1701

Abstract

The rapid expansion of digital ecosystems has intensified concerns about data security, privacy, and trust. Blockchain technology, characterized by its decentralized, immutable, and transparent nature, offers a transformative approach to strengthening cybersecurity. This paper examines the synergy between blockchain and cybersecurity, emphasizing how blockchain's cryptographic foundations, consensus mechanisms, and smart contracts can mitigate cyber threats, enhance authentication, and ensure data integrity. By analyzing emerging trends, challenges, and real-world applications, this study underscores the potential of blockchain to reinforce digital trust and resilience across diverse sectors. The findings contribute to the ongoing discourse on secure digital environments by proposing an integrated framework for blockchain-based cybersecurity solutions.
Sentiment Analysis of the Matahari Application to Provide User Experience Insights using Support Vector Machine Rizal, Moch Arif Samsul; Vitianingsih, Anik Vega; Zangana, Hewa Majeed; Maukar, Anastasia Lidya; Marisa, Fitri
Building of Informatics, Technology and Science (BITS) Vol 7 No 3 (2025): December 2025
Publisher : Forum Kerjasama Pendidikan Tinggi

DOI: 10.47065/bits.v7i3.8811

Abstract

The expansion of Indonesia's digital commerce ecosystem has pushed retail companies to strengthen the quality of their online services to remain competitive. Matahari, one of the country's leading retail brands, launched its mobile app as a platform for shopping, promotions, and customer interaction. However, user feedback on the Google Play Store indicates persistent problems with system responsiveness, ease of use, and the consistency of promotional information. This study examines sentiment patterns in 2,500 user reviews and classifies them using a Support Vector Machine (SVM) based model that incorporates three kernel types: Linear, RBF, and Polynomial. Before modelling, the text corpus underwent several pre-processing steps, including tokenization, stopword filtering, and stemming, and was then represented numerically using TF-IDF weighting. Among all tested configurations, the Linear kernel produced the strongest results, achieving an accuracy rate of 88%. Despite a moderately imbalanced class distribution (1030 negative, 886 neutral, and 584 positive), the model achieved consistent performance across all classes. Evaluation using Precision, Recall, and F1-Score confirmed the validity of the 88% accuracy without the need for additional sampling techniques. From a scholarly standpoint, this research adds insight into sentiment analysis for retail applications within the Indonesian context by applying a machine-learning approach. In practice, the outcomes highlight areas for improvement, particularly technical stability, the intuitiveness of user flows, and promotional clarity, to support a better overall user experience.
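The TF-IDF weighting step described in this abstract can be sketched on toy data as below. The tokens are made up (not the paper's corpus), the reviews are assumed to be already tokenized and stemmed, and the smoothed IDF variant is one common choice among several.

```python
import math

# Toy TF-IDF: term frequency within each document times a (smoothed)
# inverse document frequency across the corpus. Rare, repeated terms
# end up with the largest weights.

def tf_idf(docs):
    n = len(docs)
    df = {}
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    weighted = []
    for doc in docs:
        weights = {}
        for term in doc:
            tf = doc.count(term) / len(doc)
            idf = math.log(n / df[term]) + 1.0  # smoothed IDF variant
            weights[term] = tf * idf
        weighted.append(weights)
    return weighted

docs = [["app", "slow", "slow"], ["app", "good"], ["promo", "unclear"]]
weights = tf_idf(docs)
```

In the first review, "slow" (repeated, and absent elsewhere) outweighs "app" (present in two documents), which is exactly the discrimination an SVM relies on.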
AI-Driven Fraud Detection in Digital Banking: A Hybrid Approach using Deep Learning and Anomaly Detection Mohammed, Harman Salih; Sallow, Zina Bibo; Zangana, Hewa Majeed
SISTEMASI Vol 15, No 1 (2026): Sistemasi: Jurnal Sistem Informasi
Publisher : Program Studi Sistem Informasi Fakultas Teknik dan Ilmu Komputer

DOI: 10.32520/stmsi.v15i1.5757

Abstract

The rapid digital transformation in the banking sector has introduced new opportunities for efficiency and customer convenience but has also amplified the risks of financial fraud. Traditional fraud detection mechanisms, often reliant on static rule-based systems, struggle to keep pace with the dynamic, evolving nature of fraudulent activities. This paper proposes a novel hybrid framework that integrates deep learning models with anomaly detection techniques to enhance the accuracy, robustness, and adaptability of fraud detection in digital banking. The proposed approach leverages a deep neural network (DNN) architecture trained under supervised learning to capture complex transactional patterns and combines it with autoencoder-based unsupervised anomaly detection to uncover previously unseen fraud strategies. Extensive experiments on benchmark financial datasets demonstrate that the hybrid system significantly outperforms state-of-the-art methods in terms of precision, recall, and false-positive reduction. Furthermore, the study highlights the scalability of the approach for real-time banking applications and its potential for multi-institutional deployment, enabling secure inter-bank fraud intelligence sharing without compromising data privacy. This work contributes to the growing field of AI-driven financial security by addressing both detection performance and adaptability to emerging fraud behaviors.
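The score-fusion idea behind such a hybrid can be sketched schematically as below. This is not the paper's architecture: the "autoencoder" is replaced by a trivial stand-in (reconstruction against the feature means of normal data), and the fusion weight and scale are assumptions; a real system would fuse a trained DNN's probability with a trained autoencoder's reconstruction error.

```python
# Schematic hybrid fraud score: a supervised probability and an
# unsupervised anomaly score are fused into one number in [0, 1].
# The anomaly detector here is a deliberate stand-in for an autoencoder.

def anomaly_score(x, normal_means):
    """Mean squared deviation from the 'normal' profile (AE stand-in)."""
    return sum((xi - mi) ** 2 for xi, mi in zip(x, normal_means)) / len(x)

def hybrid_score(supervised_prob, anom, alpha=0.5, scale=2.0):
    """Convex combination of the two detectors' outputs, capped at 1."""
    return alpha * supervised_prob + (1 - alpha) * min(anom / scale, 1.0)

normal_means = [0.1, 0.2, 0.1]
legit = hybrid_score(0.05, anomaly_score([0.1, 0.25, 0.1], normal_means))
fraud = hybrid_score(0.90, anomaly_score([2.0, 1.5, 1.8], normal_means))
```

The unsupervised term lets a transaction that the supervised model has never seen labeled (a novel fraud strategy) still raise the combined score.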
Comparative Analysis of Naïve Bayes and K-Nearest Neighbor for Lexicon-Based Emotion Classification of Paxel App User Reviews Salsabilah, Azka; Vitianingsih, Anik Vega; Cahyono, Dwi; Lidya Maukar, Anastasia; Zangana, Hewa Majeed
JOURNAL OF INFORMATICS AND TELECOMMUNICATION ENGINEERING Vol. 9 No. 2 (2026): Issues January 2026
Publisher : Universitas Medan Area

DOI: 10.31289/jite.v9i2.16516

Abstract

The rapid growth of app-based delivery services has increased the importance of understanding user emotions as an indicator of service quality. User reviews on digital platforms provide valuable insights into customer perceptions, satisfaction levels, and service-related issues. This study aims to compare the performance of Naïve Bayes and K-Nearest Neighbor (KNN) algorithms in classifying user emotions related to the Paxel application. The dataset was collected from Google Play Store and X (Twitter) using web scraping techniques and subsequently processed through text pre-processing stages, including case folding, tokenization, and stopword removal. Emotion labels were assigned using the NRC Indonesian Emotion Lexicon, while feature extraction was performed using the TF-IDF method. To address class imbalance, the Synthetic Minority Oversampling Technique (SMOTE) was applied prior to model training. Experimental results show that the Naïve Bayes model achieved the highest overall accuracy of 90.83% with a weighted F1-score of 0.90, while the KNN model obtained an accuracy of 81.21% and a weighted F1-score of 0.77. Both models performed well in identifying happy, sad, and neutral emotions, whereas anger remained the most challenging class to classify. Overall, Naïve Bayes demonstrated more consistent and reliable performance for sentiment analysis tasks.
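The Naïve Bayes side of the comparison can be sketched with a tiny multinomial model on made-up Indonesian review tokens. This is an illustration only: the toy documents and labels are assumptions, and the paper's pipeline additionally applies lexicon labeling, TF-IDF features, and SMOTE, all omitted here.

```python
import math
from collections import Counter

# Tiny multinomial Naive Bayes with Laplace (add-one) smoothing for
# emotion classification of tokenized reviews. Toy data, illustrative only.

def train_nb(docs, labels):
    classes = set(labels)
    prior = {c: labels.count(c) / len(labels) for c in classes}
    counts = {c: Counter() for c in classes}
    for doc, y in zip(docs, labels):
        counts[y].update(doc)
    vocab = {t for doc in docs for t in doc}
    return prior, counts, vocab

def predict_nb(doc, prior, counts, vocab):
    best, best_lp = None, float("-inf")
    for c, p in prior.items():
        total = sum(counts[c].values())
        lp = math.log(p)
        for t in doc:
            lp += math.log((counts[c][t] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

docs = [["cepat", "puas"], ["lambat", "kecewa"],
        ["senang", "puas"], ["marah", "lambat"]]
labels = ["happy", "sad", "happy", "angry"]
model = train_nb(docs, labels)
```

Add-one smoothing is what keeps unseen tokens from zeroing out a class's likelihood, one reason Naïve Bayes tends to be robust on small, imbalanced review corpora.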
A Hybrid Quantum-Classical Optimization Model for Reconfigurable Intelligent Surfaces in 6G Networks Zangana, Hewa Majeed; Sulaiman, Maryam A.
Control Systems and Optimization Letters Vol 4, No 1 (2026)
Publisher : Peneliti Teknologi Teknik Indonesia

DOI: 10.59247/csol.v4i1.276

Abstract

Reconfigurable Intelligent Surfaces (RIS) have emerged as a key enabler for sixth-generation (6G) wireless networks by providing programmable control over the radio propagation environment. However, optimizing RIS configurations in large-scale and dynamic 6G scenarios remains a computationally intensive and non-convex problem, particularly under realistic channel conditions involving user mobility, multi-user interference, and fading effects. This paper proposes a hybrid quantum–classical optimization framework that integrates a Variational Quantum Eigensolver (VQE)–based optimization module with classical iterative solvers to efficiently configure RIS phase shifts and reflection coefficients. The quantum component facilitates probabilistic exploration of the high-dimensional and combinatorial search space associated with large RIS deployments, while the classical component enforces system constraints and ensures convergence stability. Simulation results under realistic 6G channel models demonstrate that the proposed hybrid approach achieves up to 32% faster convergence, 18–25% improvement in spectral efficiency, and notable energy efficiency gains compared to state-of-the-art classical optimization techniques. Furthermore, the framework exhibits scalable performance with increasing RIS element counts and user density, highlighting its suitability for near real-time RIS control under noisy intermediate-scale quantum (NISQ) hardware constraints. These findings indicate that hybrid quantum–classical optimization constitutes a practical and scalable solution for intelligent, adaptive, and energy-efficient RIS-assisted 6G networks.
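The hybrid loop described above can be caricatured entirely classically: a stochastic proposer (standing in for the quantum VQE sampler) explores discrete phase configurations, and a deterministic coordinate-refinement pass (the "classical solver") improves each element in turn. Everything here is an assumption for illustration: the 2-bit phase set, the toy coherent-gain objective, and the element counts are not from the paper, and no quantum hardware is involved.

```python
import math
import random

# Classical caricature of hybrid RIS phase optimization: random proposals
# (exploration) followed by greedy per-element refinement (exploitation)
# of a toy coherent-combining gain toward a desired steering vector.

PHASES = [0.0, math.pi / 2, math.pi, 3 * math.pi / 2]  # 2-bit phase shifter

def gain(config, steering):
    """|sum of unit reflections aligned against the steering vector|^2."""
    re = sum(math.cos(p - s) for p, s in zip(config, steering))
    im = sum(math.sin(p - s) for p, s in zip(config, steering))
    return re * re + im * im

def optimize(steering, n_elems=6, iters=50, seed=0):
    rng = random.Random(seed)
    best = [rng.choice(PHASES) for _ in range(n_elems)]
    for _ in range(iters):
        cand = [rng.choice(PHASES) for _ in range(n_elems)]   # explore
        for i in range(n_elems):                              # refine
            cand[i] = max(PHASES, key=lambda p: gain(
                cand[:i] + [p] + cand[i + 1:], steering))
        if gain(cand, steering) > gain(best, steering):
            best = cand
    return best
```

Because the current phase is always among the candidates considered, each refinement step is monotone coordinate ascent; the stochastic restarts play the role the abstract assigns to probabilistic quantum exploration of the combinatorial search space.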