Contact Name
Hairani
Contact Email
matrik@universitasbumigora.ac.id
Phone
+6285933083240
Journal Mail Official
matrik@universitasbumigora.ac.id
Editorial Address
Jl. Ismail Marzuki-Cilinaya-Cakranegara-Mataram 83127
Location
Kota Mataram,
Nusa Tenggara Barat
INDONESIA
MATRIK : Jurnal Manajemen, Teknik Informatika, dan Rekayasa Komputer
Published by Universitas Bumigora
ISSN: 1858-4144 | EISSN: 2476-9843 | DOI: 10.30812/matrik
Core Subject: Science
MATRIK is a scientific journal of Universitas Bumigora Mataram (formerly STMIK Bumigora Mataram), managed under the Institute for Research and Community Service (LPPM). The journal aims to provide a publication venue for lecturers, researchers, and practitioners both inside and outside Universitas Bumigora Mataram. MATRIK is published twice a year, in the even-semester period (May) and the odd-semester period (November).
Articles in issue Vol 24 No 1 (2024): 15 Documents
Reducing Transmission Signal Collisions on Optimized Link State Routing Protocol Using Dynamic Power Transmission
Authors: Mahabbati, Lathifatul; Jatmika, Andy Hidayat; Huwae, Raphael Bianco
MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer Vol 24 No 1 (2024)
Publisher : LPPM Universitas Bumigora

DOI: 10.30812/matrik.v24i1.3899

Abstract

Many devices connected to a network inevitably produce collisions between communication signals. These collisions are an important cause of degraded network performance, particularly of Quality of Service (QoS) metrics such as throughput, Packet Delivery Ratio (PDR), and end-to-end delay, and they directly affect the success of data transmission by potentially causing data loss or corruption. The aim of this research is to integrate the Dynamic Power Transmission (DPT) algorithm into the Optimized Link State Routing (OLSR) protocol to regulate the communication signal strength range. The DPT algorithm dynamically adapts the signal coverage distance based on the density of neighboring nodes to reduce signal collisions. In our protocol, the basic mechanism of the DPT algorithm consists of four steps. The Hello message structure of OLSR is modified to carry an "x-y position" coordinate field. Nodes calculate distances to neighbors using these coordinates, which is crucial for route discovery, where all nearby nodes can process route requests. The results show that DPT-OLSR improves network efficiency in busy areas: the DPT-OLSR routing protocol achieves an average throughput enhancement of 0.93%, a 94.79% rise in PDR, and a 45.69% reduction in end-to-end delay across various node densities. The implication is that the proposed algorithm automatically adapts the transmission power of individual nodes to control the number of neighboring nodes within a defined range. This effectively avoids unwanted interference, unnecessary overhearing, and excessive processing by other nodes, ultimately boosting the network's overall throughput.
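As a rough illustration of the adaptation step the abstract describes, the sketch below shrinks or grows a node's transmission power based on how many neighbors fall inside its current coverage radius, using the x-y coordinates carried in the modified Hello messages. The threshold, power bounds, and propagation model are illustrative assumptions, not the paper's implementation.

```python
import math

# Hypothetical parameters: target neighbor count and power bounds are
# assumptions for illustration, not values from the paper.
TARGET_NEIGHBORS = 8
P_MIN, P_MAX = 1.0, 10.0   # transmission power bounds (arbitrary units)
STEP = 0.5                 # power adjustment per Hello interval

def distance(a, b):
    """Euclidean distance between two (x, y) positions from Hello messages."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def coverage_radius(power):
    """Toy propagation model: coverage radius grows with the root of power."""
    return 10.0 * math.sqrt(power)

def adapt_power(my_pos, neighbor_positions, power):
    """One DPT-style step: lower power when too many neighbors are in
    range (fewer collisions), raise it when too few are (connectivity)."""
    in_range = sum(1 for p in neighbor_positions
                   if distance(my_pos, p) <= coverage_radius(power))
    if in_range > TARGET_NEIGHBORS:
        return max(P_MIN, power - STEP)
    if in_range < TARGET_NEIGHBORS:
        return min(P_MAX, power + STEP)
    return power
```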
Characterizing Hardware Utilization on Edge Devices when Inferring Compressed Deep Learning Models
Authors: Nabhaan, Ahmad Naufal Labiib; Rachmanto, Rakandhiya Daanii; Setyanto, Arief
MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer Vol 24 No 1 (2024)
Publisher : LPPM Universitas Bumigora

DOI: 10.30812/matrik.v24i1.3938

Abstract

Implementing edge AI involves running AI algorithms near the sensors. Deep Learning (DL) models have tackled image classification tasks with remarkable performance; however, their demand for large computing resources hinders deployment on edge devices. Compressing the model is therefore essential for running DL models on edge hardware. Post-training quantization (PTQ) is a compression technique that reduces the bit representation of the model's weight parameters. This study examines the impact of memory allocation on the latency of compressed DL models on the Raspberry Pi 4 Model B (RPi4B) and NVIDIA Jetson Nano (J. Nano), aiming to understand hardware utilization across the central processing unit (CPU), graphics processing unit (GPU), and memory. The study follows a quantitative method that controls memory allocation; measures warm-up time, latency, and CPU and GPU utilization; and compares inference speed of DL models on the RPi4B and J. Nano. We observe the correlation between hardware utilization and the various DL inference latencies. According to our experiments, smaller memory allocation led to higher latency on both the RPi4B and J. Nano. CPU utilization on the RPi4B increases along with memory allocation; the opposite holds on the J. Nano, since the GPU carries out the main computation on that device. Regarding computation, a smaller DL model size and smaller bit representation lead to faster inference (lower latency), while a larger bit representation of the same DL model leads to higher latency.
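A minimal sketch of the kind of pipeline the abstract outlines, assuming TensorFlow Lite as the PTQ toolchain (the abstract does not name one): apply dynamic-range quantization post-training, then measure warm-up and steady-state latency the way one would on an RPi4B or J. Nano. The placeholder model and iteration counts are assumptions.

```python
import time
import numpy as np
import tensorflow as tf

# Post-training quantization: dynamic-range quantization reduces the
# bit width of weights without retraining.
model = tf.keras.applications.MobileNetV2(weights=None)  # placeholder model
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Measure warm-up time and steady-state inference latency.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
x = np.random.rand(*inp["shape"]).astype(np.float32)

interpreter.set_tensor(inp["index"], x)
t0 = time.perf_counter()
interpreter.invoke()                      # first call includes warm-up cost
warmup = time.perf_counter() - t0

times = []
for _ in range(50):
    interpreter.set_tensor(inp["index"], x)
    t0 = time.perf_counter()
    interpreter.invoke()
    times.append(time.perf_counter() - t0)
print(f"warm-up {warmup*1e3:.1f} ms, mean latency {np.mean(times)*1e3:.1f} ms")
```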
Variation of Distributed Power Control Algorithm in Co-Tier Femtocell Network
Authors: Harahap, Fatur Rahman; Isnawati, Anggun Fitrian; Ni'amah, Khoirun
MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer Vol 24 No 1 (2024)
Publisher : LPPM Universitas Bumigora

DOI: 10.30812/matrik.v24i1.3992

Abstract

Wireless communication networks have grown rapidly, especially with the widespread use of smartphones, while resources are increasingly limited, especially indoors. Femtocell, a spectrum-efficient small cellular network solution, faces challenges in distributed power control (DPC) when deployed with distributed users, which affects power levels and causes interference in the main network. This research aims to optimize user power consumption in co-tier femtocell networks through power treatment of users. The study applies DPC variations, namely Distributed Constrained Power Control (DCPC), Half Distributed Constrained Power Control (HDCPC), and Generalized Distributed Constrained Power Control (GDCPC), in a co-tier femtocell network. It examines scenarios where user power converges but exceeds the maximum threshold or remains semi-feasible, considering factors such as the number of users, distance, channel usage, maximum power values, non-negative power vectors, Signal-to-Interference-plus-Noise Ratio (SINR), and link-gain matrix values. In DPC, distance and channel utilization determine the feasibility condition: feasible, semi-feasible, or non-feasible. The results show that HDCPC is more effective than DCPC in semi-feasible conditions, thanks to its efficient power usage at a similar SINR. HDCPC is also easier to implement than GDCPC because it does not require deactivating users who exceed the maximum power limit. DPC variations can shift the power and SINR conditions from non-convergence to convergence at or below the maximum power level. We conclude that HDCPC performs best among the DPC variations.
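For concreteness, here is a minimal sketch of the classic DCPC update, one of the three variations compared: each user iteratively scales its power toward a target SINR, capped at the maximum power (HDCPC and GDCPC differ in how the cap and user deactivation are handled). The gain matrix, noise level, and targets below are illustrative, not the paper's scenario.

```python
import numpy as np

def dcpc(G, noise, gamma_target, p_max, iters=200):
    """Distributed Constrained Power Control: each user scales its power
    toward the target SINR, capped at p_max. G is the link-gain matrix;
    G[i, i] is user i's own link gain."""
    p = np.full(G.shape[0], 0.01)          # assumed initial power vector
    for _ in range(iters):
        # Interference at user i: all received powers except its own link.
        interference = G @ p - np.diag(G) * p + noise
        sinr = np.diag(G) * p / interference
        p = np.minimum(p_max, (gamma_target / sinr) * p)
    interference = G @ p - np.diag(G) * p + noise
    sinr = np.diag(G) * p / interference
    return p, sinr

# Toy 3-user co-tier scenario with dominant own-link gains.
G = np.array([[1.0, 0.10, 0.05],
              [0.08, 1.0, 0.10],
              [0.06, 0.09, 1.0]])
p, sinr = dcpc(G, noise=1e-3, gamma_target=5.0, p_max=1.0)
print("power vector:", p.round(4), "SINR:", sinr.round(2))
```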
Implementation of The Extreme Gradient Boosting Algorithm with Hyperparameter Tuning in Celiac Disease Classification
Authors: Alfirdausy, Roudlotul Jannah; Ulinnuha, Nurissaidah; Utami, Wika Dianita
MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer Vol 24 No 1 (2024)
Publisher : LPPM Universitas Bumigora

DOI: 10.30812/matrik.v24i1.4031

Abstract

Celiac Disease (CeD) is an autoimmune disorder triggered by gluten consumption that involves the immune system and HLA in the intestine. The global incidence ranges from 0.5% to 1%, with only 30% of cases correctly diagnosed. Diagnosis remains challenging, requiring complex tests such as blood tests, small bowel biopsy, and elimination of gluten from the diet; a faster and more efficient alternative is therefore needed. Extreme Gradient Boosting (XGBoost), an ensemble machine learning technique built on decision trees, was used to classify Celiac disease. The aim of this study was to classify patients into six classes, namely potential, atypical, silent, typical, latent, and no disease, based on attributes such as blood test results, clinical symptoms, and medical history. The method employs 5-fold cross-validation to optimize four parameters: max depth, number of estimators, gamma, and learning rate. Experiments were conducted 96 times to find the best parameter combination. The results show an improvement of 0.45% over the 98.19% accuracy obtained with default XGBoost parameters. The best model used a max depth of 3, 100 estimators, a gamma of 0, and a learning rate of 0.3 or 0.5, yielding an accuracy of 98.64%, a sensitivity of 98.43%, and a specificity of 99.72%. This research shows that tuning the XGBoost parameters for Celiac disease classification improves performance over the default configuration.
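A minimal sketch of the tuning setup described: a 5-fold grid search over the four named parameters using scikit-learn and XGBoost. The synthetic data and candidate values are assumptions; the grid below happens to yield 96 combinations, matching the number of experiments, but the paper's actual value ranges are not given.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from xgboost import XGBClassifier

# Synthetic stand-in for the celiac dataset (real features are blood-test
# results, clinical symptoms, and medical history).
X, y = make_classification(n_samples=600, n_features=15, n_informative=8,
                           n_classes=6, random_state=0)

# Grid over the four parameters named in the abstract; candidate values
# here are assumptions for illustration (4 * 2 * 3 * 4 = 96 combinations).
param_grid = {
    "max_depth": [3, 5, 7, 9],
    "n_estimators": [100, 200],
    "gamma": [0, 1, 5],
    "learning_rate": [0.1, 0.3, 0.5, 0.7],
}
search = GridSearchCV(XGBClassifier(), param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 4))
```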
Integration of Deep Learning and Autoregressive Models for Marine Data Prediction
Authors: Mukhlis, Mukhlis; Maulidia, Puput Yuniar; Mujib, Achmad; Muhajirin, Adi; Perdana, Alpi Surya
MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer Vol 24 No 1 (2024)
Publisher : LPPM Universitas Bumigora

DOI: 10.30812/matrik.v24i1.4032

Abstract

Climate change and human activities significantly affect the dynamics of the marine environment, making accurate predictions essential for resource management and disaster mitigation. Deep learning models such as Long Short-Term Memory (LSTM) excel at capturing non-linear temporal patterns, while autoregressive models handle linear trends to improve prediction accuracy. This study aims to predict sea surface temperature, height, and salinity using deep learning compared with Moving Average and Autoregressive Integrated Moving Average methods. The research methods include spatial gap analysis, temporal variability modeling, and oceanographic parameter prediction. The relationship between parameters is analyzed using the Pearson correlation method. The dataset is divided into 80% training and 20% test data, with prediction results compared between the LSTM, Moving Average, and Autoregressive models. The results show that LSTM performs best, with a Root Mean Squared Error of 0.1096 and a Mean Absolute Error of 0.0982 for salinity at 13 sample points. In contrast, the Autoregressive models produce Root Mean Squared Errors of 0.193 for salinity, 0.055 for sea surface height, and 2.504 for sea surface temperature, with a correlation coefficient of 0.6 between temperature and sea surface height. In conclusion, the LSTM model excels at predicting salinity because it captures complex non-linear patterns, while the Autoregressive models suit linear data trends and explain the relationships between parameters, although their salinity prediction accuracy is lower. This approach integrates deep learning and autoregressive models for marine data prediction.
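The comparison the abstract describes can be sketched as follows, assuming Keras for the LSTM and statsmodels' ARIMA as a stand-in for the autoregressive baseline; the synthetic salinity-like series, window length, and network size are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf
from sklearn.metrics import mean_squared_error
from statsmodels.tsa.arima.model import ARIMA

# Synthetic stand-in for a salinity series (real inputs are gridded
# oceanographic observations).
t = np.arange(500)
series = (34 + 0.3 * np.sin(2 * np.pi * t / 50)
          + 0.05 * np.random.randn(500)).astype("float32")
split = int(len(series) * 0.8)        # 80/20 split as in the paper
train, test = series[:split], series[split:]

# Autoregressive baseline.
ar = ARIMA(train, order=(5, 0, 0)).fit()
ar_pred = ar.forecast(steps=len(test))

# LSTM trained on sliding windows of the series.
win = 20
X = np.stack([series[i:i + win] for i in range(split - win)])[..., None]
y = series[win:split]
model = tf.keras.Sequential([tf.keras.layers.LSTM(32, input_shape=(win, 1)),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, verbose=0)

# One-step-ahead LSTM forecasts over the test period.
hist = list(series[split - win:split])
lstm_pred = []
for actual in test:
    p = model.predict(np.array(hist[-win:])[None, :, None], verbose=0)[0, 0]
    lstm_pred.append(p)
    hist.append(actual)

print("AR RMSE  :", mean_squared_error(test, ar_pred) ** 0.5)
print("LSTM RMSE:", mean_squared_error(test, lstm_pred) ** 0.5)
```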
Cluster Validity for Optimizing Classification Model: Davies Bouldin Index – Random Forest Algorithm
Authors: Prihandoko, Prihandoko; Jollyta, Deny; Gusrianty, Gusrianty; Siddik, Muhammad; Johan, Johan
MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer Vol 24 No 1 (2024)
Publisher : LPPM Universitas Bumigora

DOI: 10.30812/matrik.v24i1.4043

Abstract

Several factors affect pregnant women's health and mortality rates, and the symptoms of disease in pregnant women are often similar. This makes it difficult to evaluate which factors contribute to a low, medium, or high risk of mortality among pregnant women. The purpose of this research is to generate classification rules for maternal health risk using optimal clusters. The optimal cluster is obtained through cluster-validity analysis. The methods used are K-Means clustering, the Davies-Bouldin Index (DBI), and the Random Forest algorithm. These methods build optimal clusters from a set of k-tests to produce the best classification. Optimal clusters, whose members share strong similarities, are derived from high-dimensional data; therefore, the Principal Component Analysis (PCA) technique is required to evaluate attribute values. The result is that the best classification rule was obtained at k-tests = 22 on the 20th cluster, with an accuracy of 97% for low, mid, and high risk. The novelty lies in using DBI to validate the data that the Random Forest will classify. According to the findings, classification rules created through optimal clusters are 9.7% better than those obtained without clustering. This demonstrates that optimizing the data grouping improves the classification algorithm's performance.
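A compact sketch of this kind of pipeline, assuming scikit-learn throughout: PCA for dimensionality reduction, a k sweep scored by the Davies-Bouldin Index, and a Random Forest trained against the winning cluster labels. The data, component count, and k range are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import davies_bouldin_score
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the maternal health risk data.
X, _ = make_classification(n_samples=500, n_features=10, random_state=0)
X = PCA(n_components=5).fit_transform(X)   # PCA step named in the abstract

# Pick the k with the lowest Davies-Bouldin Index (lower = better separated).
best_k, best_dbi = None, np.inf
for k in range(2, 23):                     # the paper reports k-tests = 22
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    dbi = davies_bouldin_score(X, labels)
    if dbi < best_dbi:
        best_k, best_dbi = k, dbi

# Train a Random Forest to reproduce the optimal clustering as class labels.
labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)
acc = cross_val_score(RandomForestClassifier(random_state=0), X, labels, cv=5)
print(f"best k={best_k}, DBI={best_dbi:.3f}, RF accuracy={acc.mean():.3f}")
```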
Optimizing Currency Circulation Forecasts in Indonesia: A Hybrid Prophet-Long Short Term Memory Model with Hyperparameter Tuning
Authors: Aziza, Vivin Nur; Syafitri, Utami Dyah; Fitrianto, Anwar
MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer Vol 24 No 1 (2024)
Publisher : LPPM Universitas Bumigora

DOI: 10.30812/matrik.v24i1.4052

Abstract

The core problem for decision-makers lies in selecting an effective forecasting method, particularly when faced with the challenges of nonlinearity and nonstationarity in time series data. To address this, hybrid models are increasingly employed to enhance forecasting accuracy. In Indonesia and other Muslim countries, monthly economic and business time series often include trends, seasonality, and calendar variations. This study compares the performance of the hybrid Prophet-Long Short-Term Memory (LSTM) model with its individual components for forecasting such patterned time series. The aim is to identify the best model through a hybrid approach for forecasting time series exhibiting trend, seasonality, and calendar variations, using the real-life case of currency circulation in South Sulawesi. Model quality is evaluated using the smallest Mean Absolute Percentage Error (MAPE) and Root Mean Square Error (RMSE) values. The results indicate that the hybrid Prophet-LSTM model achieves superior accuracy, especially for predicting currency outflow, with lower MAPE and RMSE values than the standalone models. The LSTM model performs excellently for currency inflow, while the Prophet model lags in both inflow and outflow accuracy. This insight is valuable for Bank Indonesia's strategic planning, aiding better cash-flow prediction and currency stock management.
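A hybrid of this kind is typically built by letting Prophet fit trend, seasonality, and calendar effects, then training an LSTM on Prophet's residuals. The sketch below follows that common pattern under assumed data and hyperparameters, not the paper's exact configuration.

```python
import numpy as np
import pandas as pd
import tensorflow as tf
from prophet import Prophet

# Synthetic monthly series standing in for currency outflow data.
ds = pd.date_range("2015-01-01", periods=96, freq="MS")
y = (100 + np.arange(96) + 10 * np.sin(2 * np.pi * np.arange(96) / 12)
     + np.random.randn(96))
df = pd.DataFrame({"ds": ds, "y": y})

# Stage 1: Prophet captures trend and seasonality; calendar effects such
# as Eid could be added via Prophet's holidays mechanism.
m = Prophet()
m.fit(df)
fitted = m.predict(df[["ds"]])["yhat"].to_numpy()
resid = df["y"].to_numpy() - fitted        # nonlinear remainder for the LSTM

# Stage 2: an LSTM learns the residual structure from sliding windows.
win = 12
X = np.stack([resid[i:i + win] for i in range(len(resid) - win)])[..., None]
t = resid[win:]
lstm = tf.keras.Sequential([tf.keras.layers.LSTM(16, input_shape=(win, 1)),
                            tf.keras.layers.Dense(1)])
lstm.compile(optimizer="adam", loss="mse")
lstm.fit(X, t, epochs=50, verbose=0)

# Hybrid forecast = Prophet forecast + LSTM residual correction.
future = m.make_future_dataframe(periods=1, freq="MS").tail(1)
prophet_next = m.predict(future)["yhat"].iloc[0]
resid_next = lstm.predict(resid[-win:][None, :, None], verbose=0)[0, 0]
print("hybrid forecast:", prophet_next + resid_next)
```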
Population Prediction Using Multiple Regression and Geometry Models Based on Demographic Data
Authors: Safii, M; Setiana, Rika
MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer Vol 24 No 1 (2024)
Publisher : LPPM Universitas Bumigora

DOI: 10.30812/matrik.v24i1.4121

Abstract

Population growth is an important issue because it significantly affects a country's growth and development. Large population growth can supply the resources that drive the economy and national development; on the other hand, it can also create problems of poverty, hunger, unemployment, education, and health. The government needs to control population growth so that it is balanced by good population quality. According to data from the Population and Civil Registration Office of Simalungun Regency, the Tanah Jawa sub-district has a high population that continues to increase every year, affecting the welfare of residents, most of whom work as laborers and farmers. To address this, the future population must be predicted so that the government can make the right decisions and policies for population control. This study makes predictions using two models: Multiple Linear Regression, to find linear equations, and a Geometric Model, for population growth projections. It uses three independent variables, namely birth rate (X1), migration rate (X2), and death rate (X3), and one dependent variable, population size (Y). The results show that the Tanah Jawa sub-district population is expected to increase over the next five years (2024-2028); by 2024, it is expected to reach 61,178 people, up from 59,589 in 2023. These results can guide the authorities in planning strategies and allocating resources, and they contribute significantly to estimating population development in the Tanah Jawa region so that a future population explosion and its negative impacts can be avoided.
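Both models reduce to a few lines. The sketch below fits a multiple linear regression on hypothetical birth, migration, and death figures and projects population with the geometric formula P_t = P_0(1 + r)^t; only the 2023 population of 59,589 is taken from the abstract, everything else is an illustrative assumption.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical yearly demographic data (not the Simalungun records):
# births X1, in-migration X2, deaths X3, and population Y.
X = np.array([[950, 120, 400], [980, 130, 410], [1000, 110, 405],
              [1020, 140, 415], [1050, 135, 420]], dtype=float)
Y = np.array([57500, 58000, 58550, 59050, 59589], dtype=float)

# Multiple linear regression: Y = b0 + b1*X1 + b2*X2 + b3*X3.
reg = LinearRegression().fit(X, Y)
print("intercept:", reg.intercept_, "coefficients:", reg.coef_)

# Geometric model: P_t = P_0 * (1 + r)^t, with r estimated from the
# average annual growth over the observed span.
P0, Pn, years = Y[0], Y[-1], len(Y) - 1
r = (Pn / P0) ** (1 / years) - 1
for t in range(1, 6):                      # project 2024-2028
    print(2023 + t, round(Pn * (1 + r) ** t))
```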
Segmentation and Classification of Breast Cancer Histopathological Image Utilizing U-Net and Transfer Learning ResNet50
Authors: Sudianjaya, Nella Rosa; Fatichah, Chastine
MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer Vol 24 No 1 (2024)
Publisher : LPPM Universitas Bumigora

DOI: 10.30812/matrik.v24i1.4186

Abstract

Breast cancer is the most common type of cancer. Approximately 1 in 8 women in the United States will develop breast cancer in her lifetime. Early screening and accurate diagnosis are essential for prevention and accelerated treatment intervention. Several artificial intelligence methods have emerged for effective segmentation, detection, and classification of cancer types. Although automated algorithms for breast cancer histopathology image analysis have progressed, many approaches still face challenges, which this study aims to address. The method develops a U-Net architecture combined with transfer learning, using ResNet50 as the encoder path to improve the model's sensitivity in segmenting and classifying cancer areas by exploiting the deep hierarchical features ResNet50 extracts. In addition, data augmentation techniques create a diverse and comprehensive training dataset, improving the model's ability to distinguish between tissue types and cancer areas. The resulting U-Net with ResNet50 achieves an average IoU of 0.482 and a Dice coefficient of 0.916. The study concludes that integrating U-Net with transfer-learning ResNet50 improves segmentation and classification accuracy on breast cancer histopathology images while mitigating high computational requirements. This approach shows significant potential for improving early breast cancer detection and diagnosis.
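A minimal sketch of a U-Net decoder grafted onto a Keras ResNet50 encoder, with skip connections taken at the standard block outputs; the filter counts and decoder depth are illustrative assumptions rather than the paper's exact architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers

def resnet50_unet(input_shape=(224, 224, 3), n_classes=1):
    """U-Net-style decoder over a pretrained ResNet50 encoder."""
    encoder = tf.keras.applications.ResNet50(
        include_top=False, weights="imagenet", input_shape=input_shape)
    # Standard Keras ResNet50 block outputs used as skip connections
    # (112x112, 56x56, 28x28, 14x14 feature maps for a 224x224 input).
    skips = [encoder.get_layer(n).output for n in
             ["conv1_relu", "conv2_block3_out",
              "conv3_block4_out", "conv4_block6_out"]]
    x = encoder.output                     # 7x7 bottleneck features
    for skip, filters in zip(reversed(skips), [512, 256, 128, 64]):
        x = layers.Conv2DTranspose(filters, 3, strides=2, padding="same")(x)
        x = layers.Concatenate()([x, skip])
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(32, 3, strides=2, padding="same")(x)  # to 224
    out = layers.Conv2D(n_classes, 1, activation="sigmoid")(x)
    return tf.keras.Model(encoder.input, out)

model = resnet50_unet()
model.compile(optimizer="adam", loss="binary_crossentropy")
```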
Development of Smart Charity Box Monitoring Robot in Mosque with Internet of Things and Firebase using Raspberry Pi
Authors: Anggraini, Nenny; Zulkifli, Zulkifli; Hakiem, Nashrul
MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer Vol 24 No 1 (2024)
Publisher : LPPM Universitas Bumigora

DOI: 10.30812/matrik.v24i1.4209

Abstract

Mosques are the center of Muslim communities' spiritual and communal life and therefore require effective financial management. The purpose of this study was to develop a smart donation box robot that uses Internet of Things technology to improve efficiency and transparency in managing donations. The study used a prototyping methodology consisting of Rapid Planning, Rapid Modeling, Construction, and Evaluation stages, aimed at developing a functional prototype quickly. The results show that the smart donation box robot detected and counted banknote denominations with varying degrees of success, achieving a 100% detection rate for all tested denominations at the optimal sensor distance of 1 cm. However, the detection rate dropped to 42.86% at 0.5 cm and 28.57% at 1.5 cm, highlighting the significant impact of sensor placement on performance. Coin detection was performed accurately, correctly identifying and sorting denominations without error. The system enabled real-time financial monitoring via the Telegram application, significantly increasing transparency for mosque administrators and congregants. This study confirms that IoT technology can substantially improve mosque donation management by automating the collection process and providing real-time financial monitoring.
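The real-time notification path can be sketched with the standard Telegram Bot API, which such a Raspberry Pi system would call after each detection event. The bot token, chat ID, and event values below are hypothetical placeholders, and the banknote/coin detection logic itself is omitted.

```python
import requests

# Hypothetical credentials; sendMessage is the standard Telegram Bot API
# method for pushing text notifications to a chat.
BOT_TOKEN = "<your-bot-token>"
CHAT_ID = "<mosque-admin-chat-id>"

def notify_donation(denomination, total):
    """Push a real-time donation update to the mosque administrators."""
    text = f"Donation received: Rp{denomination:,}. Running total: Rp{total:,}."
    requests.post(
        f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage",
        json={"chat_id": CHAT_ID, "text": text},
        timeout=10,
    )

# Example: a banknote event detected at ~1 cm, the optimal sensor
# distance reported in the abstract (values are illustrative).
notify_donation(denomination=50_000, total=1_250_000)
```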
