Contact Name
Yuhefizar
Contact Email
jurnal.resti@gmail.com
Phone
+628126777956
Journal Mail Official
ephi.lintau@gmail.com
Editorial Address
Politeknik Negeri Padang, Kampus Limau Manis, Padang, Indonesia.
Location
INDONESIA
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi)
ISSN: 2580-0760     EISSN: 2580-0760     DOI: https://doi.org/10.29207/resti.v2i3.606
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) is intended as a medium for scholarly publication of research results, ideas, and critical-analytical studies on research in Systems Engineering, Informatics/Information Technology, Informatics Management, and Information Systems. It is part of the spirit of disseminating knowledge resulting from research and thought, in service to the wider community and as a reference source for academics in the field of information technology. Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) accepts scientific articles whose research scope covers: Software Engineering, Hardware Engineering, Information Security, Systems Engineering, Expert Systems, Decision Support Systems, Data Mining, Artificial Intelligence Systems, Computer Networks, Computer Engineering, Image Processing, Genetic Algorithms, Information Systems, Business Intelligence and Knowledge Management, Database Systems, Big Data, Internet of Things, Enterprise Computing, Machine Learning, and other relevant topics.
Articles 1,046 Documents
Penerapan Deep Learning dalam Deteksi Penipuan Transaksi Keuangan Secara Elektronik Faried Zamachsari; Niken Puspitasari
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 5 No 2 (2021): April 2021
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

Full PDF (810.357 KB) | DOI: 10.29207/resti.v5i2.2952

Abstract

The rapid development of information technology, coupled with increased public activity in electronic financial transactions, has provided convenience but has also been accompanied by fraudulent financial transactions. The purpose of this research is to find the best model to implement in a banking payment system for detecting fraudulent electronic financial transactions, so as to prevent losses for customers and banks. Fraud detection is performed with ensemble machine learning and deep learning, combined with SMOTE. The financial transaction data are taken from a bank payment simulation built on the concept of Multi Agent-Based Simulation (MABS) by a bank in Spain. In building the best model, not only the accuracy value matters; the precision value also needs attention, since precision is very important for fraud prevention. Without SMOTE, the best result is obtained by deep learning, with an accuracy of 99.602% and a precision of 90.574%. Adding SMOTE increases both accuracy and precision, with the best model produced by the Extra Trees classifier at an accuracy of 99.835% and a precision of 99.786%.
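As a rough illustration of the reported pipeline, the sketch below applies SMOTE oversampling followed by an Extra Trees classifier using scikit-learn and imbalanced-learn; the synthetic data from make_classification merely stands in for the MABS bank-payment dataset, and the split ratio and hyperparameters are assumptions rather than the paper's exact settings.

```python
# Sketch only: SMOTE oversampling + Extra Trees, with synthetic data standing in for the
# MABS bank-payment transactions. Split ratio and hyperparameters are assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score
from imblearn.over_sampling import SMOTE

# Imbalanced stand-in data: roughly 2% "fraud" class.
X, y = make_classification(n_samples=5000, n_features=10, weights=[0.98, 0.02], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Oversample the minority (fraud) class on the training split only.
X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)

clf = ExtraTreesClassifier(n_estimators=100, random_state=0).fit(X_res, y_res)
y_pred = clf.predict(X_test)
print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred))
```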
Klasifikasi Sentimen pada Twitter Terhadap WHO Terkait Covid-19 Menggunakan SVM, N-Gram, PSO Noor Hafidz; Dewi Yanti Liliana
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 5 No 2 (2021): April 2021
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

Full PDF (336.337 KB) | DOI: 10.29207/resti.v5i2.2960

Abstract

In March 2020 the World Health Organization (WHO) declared Covid-19 a global pandemic. As the specialized agency of the United Nations responsible for international public health, the WHO has taken various actions to reduce the spreading rate of the pandemic. However, the WHO's handling of Covid-19 has not been free from controversy, giving rise to criticism and public opinion on the Twitter platform. In this research, a machine learning based classifier model was built to determine the opinion or sentiment of a tweet. The dataset is a set of tweets containing the terms WHO and Covid-19, posted between March 1 and May 6, 2020, consisting of 4,000 tweets with positive sentiment and 4,000 tweets with negative sentiment. The proposed classifier model combines a Support Vector Machine (SVM), N-grams, and Particle Swarm Optimization (PSO). The model's performance is evaluated using Accuracy, Precision, Recall, and Area Under the ROC Curve (AUC). Based on the experiments conducted, the combination of SVM, N-grams (bigrams), and PSO produced fairly good performance in classifying tweet sentiment, with an Accuracy of 0.755, Precision of 0.719, Recall of 0.837, and AUC of 0.844.
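A minimal sketch of the SVM-with-bigram part of the described classifier, assuming a scikit-learn TF-IDF pipeline; the toy tweets, labels, and fixed C/gamma values are placeholders, and the PSO step the paper uses to tune the SVM hyperparameters is omitted.

```python
# Sketch only: TF-IDF bigram features + SVM; the paper's PSO hyperparameter search is
# omitted and C/gamma are fixed. The toy tweets and labels are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, roc_auc_score

tweets = [
    "thankful for WHO guidance on covid",        # 1 = positive sentiment
    "WHO coordination has saved many lives",     # 1
    "great work by WHO sharing covid data",      # 1
    "WHO daily briefings are really helpful",    # 1
    "WHO response to covid was far too slow",    # 0 = negative sentiment
    "WHO ignored the early covid warnings",      # 0
    "disappointed by WHO handling of covid",     # 0
    "WHO failed to act on the pandemic",         # 0
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]

X_train, X_test, y_train, y_test = train_test_split(tweets, labels, test_size=0.25,
                                                    stratify=labels, random_state=0)
model = make_pipeline(TfidfVectorizer(ngram_range=(2, 2)),   # bigrams, as in the best reported setup
                      SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred, zero_division=0))
print("recall   :", recall_score(y_test, y_pred, zero_division=0))
print("AUC      :", roc_auc_score(y_test, model.decision_function(X_test)))
```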
Evaluasi Parameter RAW Berdasarkan Multirate Pada IEEE 802.11ah: Simulasi Kinerja Optimum Jaringan IoT Haris Mustaqin; Teuku Yuliar Arif; Rizal Munadi
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 5 No 2 (2021): April 2021
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

Full PDF (652.075 KB) | DOI: 10.29207/resti.v5i2.2961

Abstract

IEEE 802.11ah WLAN is a technology standard for IoT networks because it can provide a longer transmission range and a higher data rate than WPAN and LPWAN. To manage channel access for up to 8,191 stations, the IEEE 802.11ah MAC layer introduces a Restricted Access Window (RAW) scheme. In general, evaluation and optimization of RAW parameters have only been based on constant data rates, without taking into account the multirate support of the IEEE 802.11ah PHY at the AP and the IoT stations (STA). This study uses the open-source NS-3 network simulator. The simulation analysis computes the throughput, delay, packet loss, and energy consumption of each node. Testing the effect of the number of slots on throughput shows that the resulting throughput values fluctuate but are predominantly stable, depending on the number of slots used. The effect of the number of slots on packet loss shows that the packet loss value is low for each slot configuration, because more packets can be accommodated in the RAW slot queue. The effect of the number of slots on energy consumption is a decrease at some data rates, with lower energy consumption values that save energy overall.
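The abstract evaluates four per-node metrics; the snippet below shows the standard way those metrics can be computed from per-node counters, using hypothetical numbers in place of actual NS-3 trace output.

```python
# Hypothetical per-node counters standing in for NS-3 trace output; the formulas are the
# standard definitions of the four metrics evaluated in the study.
nodes = [
    # (bytes_received, packets_sent, packets_received, total_delay_s, energy_joule)
    (1_200_000, 1000, 985, 12.3, 4.1),
    (  950_000,  900, 870, 11.8, 3.9),
    (1_050_000,  950, 920, 12.0, 4.0),
]
sim_time_s = 60.0

throughput_kbps = sum(n[0] for n in nodes) * 8 / sim_time_s / 1000
avg_delay_ms = sum(n[3] for n in nodes) / sum(n[2] for n in nodes) * 1000
packet_loss = 1 - sum(n[2] for n in nodes) / sum(n[1] for n in nodes)
avg_energy_j = sum(n[4] for n in nodes) / len(nodes)

print(f"throughput={throughput_kbps:.1f} kbps  delay={avg_delay_ms:.2f} ms  "
      f"loss={packet_loss:.2%}  energy={avg_energy_j:.2f} J/node")
```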
Prediksi Jumlah Produksi Akibat Penyebaran Covid-19 Menggunakan Metode Fuzzy Takagi-Sugeno Khofifah Putriyani; Tenia Wahyuningrum; Yogo Dwi Prasetyo
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 5 No 2 (2021): April 2021
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

Full PDF (635.036 KB) | DOI: 10.29207/resti.v5i2.2973

Abstract

Global Bakery is a food company engaged in bread production that has difficulty determining how much bread to produce during a pandemic. This study aims to help predict the amount of bread to be produced during a pandemic, making it easier for the company to set its production quantity. Data were obtained from Global Bakery and the official Covid-19 website of Bekasi Regency for the period March 20, 2020 to April 20, 2020. The authors use the Fuzzy Takagi-Sugeno method to predict the amount of bread that Global Bakery must produce during the pandemic, with the following stages: fuzzification, rule formation, calculation of the α-predicate and zi values, and then defuzzification. An evaluation is then carried out using the Mean Absolute Percentage Error (MAPE). The Predictor program was implemented using Matlab's GUI tools. The Fuzzy Takagi-Sugeno method is able to predict the amount of bread production at Global Bakery with good results: when sales are 180 pieces, remaining stock is 289, and the number of positive Covid-19 cases is 6 people, with an actual production of 469 pieces, the prediction obtained is 347 pieces. The calculations yield an accuracy in the good category, with a MAPE of 18.6%.
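A minimal zero-order Sugeno sketch of the described stages (fuzzification, α-predicate calculation, weighted-average defuzzification, MAPE); the membership functions and rule consequents are illustrative assumptions, not the paper's rule base, and the Covid-19 case count used as a third input in the paper is omitted for brevity.

```python
# Sketch only: two toy Sugeno rules over sales and remaining stock; membership functions
# and rule consequents are assumptions, not the paper's rule base.
def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def predict_production(sales, remaining):
    # Rule 1: IF sales LOW AND remaining HIGH THEN z1 = 300
    a1 = min(tri(sales, 0, 150, 300), tri(remaining, 150, 300, 450))
    # Rule 2: IF sales HIGH AND remaining LOW THEN z2 = 500
    a2 = min(tri(sales, 150, 300, 450), tri(remaining, 0, 150, 300))
    z1, z2 = 300, 500
    return (a1 * z1 + a2 * z2) / ((a1 + a2) or 1e-9)   # weighted-average defuzzification

def mape(actual, predicted):
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual) * 100

print(predict_production(sales=180, remaining=289))   # toy rules, not the paper's 347-piece result
print(mape([469, 450], [347, 420]), "% MAPE")         # illustrative error calculation
```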
Pengambilan Keputusan Sistem Penjaminan Mutu Perguruan Tinggi menggunakan MOORA, SAW, WP, dan WSM Sunardi; Abdul Fadlil; Ryan Fitrian Pahlevi
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 5 No 2 (2021): April 2021
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

Full PDF (413.839 KB) | DOI: 10.29207/resti.v5i2.2977

Abstract

Higher Education Quality Assurance (QA) is regulated through Quality Standards and a number of criteria, as well as their relationship to the implementation of the Quality Assurance System (QAS), namely the Internal and External Quality Assurance System. This research focuses on analyzing 4 decision-making methods, or Decision Support Systems (DSS), for the QAS at MAC. The purpose of this study is to organize the standard data and MAC criteria in business processes into a database integrated with the QAS decision-making method. The analysis was carried out on 4 multi-criteria decision-making methods to be used in the QAS-MAC decision-making process, namely MOORA, SAW, WP, and WSM. These methods were tested on a quality-standard database and then compared in terms of relevance, features, accuracy, precision, reliability, effectiveness, efficiency, strengths, and weaknesses. The decision-making methods, as determinants of business-process priorities, provide information for PTMA leaders in planning strategic activities. The method analysis shows the scores of the 4 decision-making methods to be MOORA (75%), SAW (75%), WP (94%), and WSM (94%).
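For reference, the sketch below computes rankings with the four methods named above (MOORA, SAW, WP, WSM) on a toy decision matrix; the alternatives, criteria, and weights are illustrative assumptions, not the paper's quality-standard data.

```python
# Toy decision matrix: rows = alternatives (e.g. business processes), columns = benefit
# criteria. Alternatives, criteria, and weights are illustrative assumptions.
import numpy as np

X = np.array([[80., 70., 90.],
              [75., 85., 60.],
              [90., 60., 75.]])
w = np.array([0.5, 0.3, 0.2])        # criterion weights, summing to 1

saw   = (X / X.max(axis=0)) @ w                     # Simple Additive Weighting (benefit criteria)
wsm   = X @ w                                       # Weighted Sum Model on raw scores
wp    = np.prod(X ** w, axis=1)                     # Weighted Product
wp    = wp / wp.sum()                               # normalised WP preference vector
moora = (X / np.sqrt((X ** 2).sum(axis=0))) @ w     # MOORA ratio system (all benefit criteria)

for name, scores in [("MOORA", moora), ("SAW", saw), ("WP", wp), ("WSM", wsm)]:
    print(f"{name:5s} best alternative: {scores.argmax()}  scores: {scores.round(3)}")
```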
High Scalability Document Clustering Algorithm Based On Top-K Weighted Closed Frequent Itemsets Gede Aditra Pradnyana; Arif Djunaidy
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 5 No 2 (2021): April 2021
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

Full PDF (548.593 KB) | DOI: 10.29207/resti.v5i2.2987

Abstract

Document clustering based on frequent itemsets can be regarded as a relatively new approach to document clustering, aimed at overcoming the curse of dimensionality of the items produced by the documents being clustered. The Maximum Capturing (MC) technique is a frequent-itemset-based document clustering algorithm capable of producing better clustering quality than other similar algorithms. However, because the Maximum Capturing technique uses frequent itemsets, it still suffers from several weaknesses: the emergence of item redundancy, which may still cause the curse of dimensionality; the difficulty of determining the minimum support value for a set of documents to be clustered; and the lack of weighting on the items included in the resulting frequent itemsets. To address these weaknesses, this research develops a document clustering algorithm based on weighted top-k closed frequent itemsets, called the Weighted Maximum Capturing (WMC) algorithm. The proposed algorithm uses the frequent-pattern-tree algorithm to mine closed frequent itemsets from a set of documents without specifying the minimum support value of the items to be generated. Experimental results show an improvement in clustering accuracy, with an average F-measure of 0.713 and purity of 0.721, an improvement ratio of 1.4% for F-measure and 2% for purity. The scalability test results show a very significant improvement: the WMC algorithm requires an average computing time of only 623.77 minutes, 518.05 minutes faster than the average computing time required by the MC algorithm.
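The frequent-itemset mining step can be sketched as below with mlxtend's FP-growth on a bag-of-words view of the documents; the weighted top-k closed-itemset selection and the clustering step of the WMC algorithm itself are not reproduced here, and the toy documents are assumptions.

```python
# Sketch only: FP-growth frequent-itemset mining on a bag-of-words view of four toy
# documents (via mlxtend); the weighted top-k closed-itemset selection and the clustering
# step of the WMC algorithm are not reproduced here.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import fpgrowth

docs = [["cluster", "itemset", "mining"],
        ["cluster", "document", "mining"],
        ["neural", "network", "image"],
        ["neural", "network", "training"]]

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(docs).transform(docs), columns=te.columns_)

# Mine itemsets that appear in at least half of the documents.
itemsets = fpgrowth(onehot, min_support=0.5, use_colnames=True)
print(itemsets.sort_values("support", ascending=False))
```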
Arsitektur Moisture Meter dengan Capacitive Sensing dan Serverless IoT Untuk Hidroponik Fertigasi I Wayan Aditya Suranata; I Gede Humaswara Prathama
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 5 No 2 (2021): April 2021
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

Full PDF (610.997 KB) | DOI: 10.29207/resti.v5i2.2993

Abstract

Current agricultural systems generally use chemical fertilizers as a growth booster in order to meet the global food needs of 7 billion people and all of their livestock. Unfortunately, not everyone is aware of the great danger that such overuse, the unmetered application of chemical fertilizers freely in open fields, poses to the survival of the planet and its population. Thanks to technological advances, especially in instrumentation and communication technology, efficiency can be increased and such overuse can be properly minimized. In this study, the researchers applied capacitive moisture sensor technology and a serverless Internet of Things architecture to a moisture meter instrument in a hydroponic drip fertigation system with roasted-husk planting media. Capacitive sensor technology has the advantage of corrosion resistance when applied to planting media with high humidity and low alkalinity. A serverless IoT architecture makes it possible to monitor from anywhere via the internet, without involving complicated and expensive infrastructure. Prototype testing shows that the instrument built works properly. The monitored system conditions, such as temperature and free heap, appear stable. The readings of the two sensors are also steady, without fluctuations or variations exceeding 5%. The process of remote monitoring and data logging to the serverless IoT backend is stable, with a data recording success rate of 99.8%.
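A conceptual sketch of the node-side flow described above (read the capacitive sensor, convert to a moisture percentage, push to a serverless HTTP endpoint); the endpoint URL, calibration constants, and read_adc() stub are placeholders, not the prototype's actual firmware.

```python
# Conceptual node-side sketch: read the capacitive ADC, convert to a moisture percentage,
# and POST it to a serverless HTTP endpoint. URL, calibration points, and read_adc() are
# placeholders, not the prototype's firmware.
import time
import requests

ENDPOINT = "https://example-serverless-endpoint.invalid/ingest"   # placeholder URL
ADC_DRY, ADC_WET = 3200, 1400                                      # assumed calibration points

def read_adc() -> int:
    """Stub for the capacitive soil-moisture ADC read (replace with real hardware access)."""
    return 2100

def adc_to_moisture(raw: int) -> float:
    pct = (ADC_DRY - raw) / (ADC_DRY - ADC_WET) * 100
    return max(0.0, min(100.0, pct))

while True:
    payload = {"moisture_pct": adc_to_moisture(read_adc()), "ts": time.time()}
    try:
        requests.post(ENDPOINT, json=payload, timeout=5)
    except requests.RequestException as err:
        print("upload failed, will retry next cycle:", err)
    time.sleep(60)   # one reading per minute
```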
Embedded Device Berbasis PLC pada Miniatur Konveyor untuk Pengoperasian Simulator Rejection System Muhamad Wildan; Arief Goeritno; Joki Irawan
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 5 No 2 (2021): April 2021
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

Full PDF (786.344 KB) | DOI: 10.29207/resti.v5i2.2994

Abstract

A PLC-based embedded device on a miniature conveyor machine for operating a rejection system has been designed and constructed. The research objectives are (i) design and manufacture of an integrated system, (ii) building the program structure based on a ladder diagram, and (iii) measuring the performance of the integrated system. The integrated system was assembled from a miniature conveyor machine with its belt installation and a DC motor, while the rejection system was assembled by placing several sensors, installing a stepper motor, and wiring everything to the PLC system. Ladder-diagram programming was carried out by determining the algorithms, composing the ladder diagram, addressing the input/output, and compiling and uploading the program from the PC to the PLC. The performance of the integrated system was observed through (a) observation during synchronization, (b) observation of sensor readings while the rejection system simulator is operating, and (c) observation and measurement of the processing time of the rejection arm. The overall result is a PLC-based embedded device for the rejection system simulator that inspects the condition of bottle caps for beverage packaging. Based on the overall observations, the PLC-based embedded device functions to operate the rejection system and can be implemented at manufacturing scale.
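The rejection logic the ladder diagram implements can be pictured roughly as the Python sketch below: a cap sensor flags a defective bottle and, after a conveyor travel delay, the rejection arm is actuated; all timings and sensor values are illustrative assumptions, and this is not the PLC program itself.

```python
# Conceptual rendering of the rejection logic, not the PLC ladder diagram itself: a cap
# sensor flags a defective bottle and, after an assumed conveyor travel delay, the
# rejection arm is actuated.
import time

TRAVEL_DELAY_S = 0.5   # assumed travel time from the sensor to the rejection arm

def cap_sensor_ok(bottle: dict) -> bool:
    return bottle["cap_present"]

def actuate_reject_arm(bottle: dict) -> None:
    print(f"reject arm fired for bottle {bottle['id']}")

bottles = [{"id": 1, "cap_present": True},
           {"id": 2, "cap_present": False},
           {"id": 3, "cap_present": True}]

for bottle in bottles:
    if not cap_sensor_ok(bottle):
        time.sleep(TRAVEL_DELAY_S)   # wait for the bottle to reach the arm
        actuate_reject_arm(bottle)
```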
Random Forest Algorithm to Investigate the Case of Acute Coronary Syndrome Eka Pandu Cynthia; M. Afif Rizky A.; Alwis Nazir; Fadhilah Syafria
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 5 No 2 (2021): April 2021
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

Full PDF (677.337 KB) | DOI: 10.29207/resti.v5i2.3000

Abstract

This paper explains the use of the Random Forest algorithm to investigate cases of Acute Coronary Syndrome (ACS). The objective of this study is to evaluate the use of data science techniques and machine learning algorithms in creating a model that can classify whether or not a case of acute coronary syndrome occurs. The research method refers to the IBM Foundational Methodology for Data Science, including: i) collecting a dataset about ACS; ii) preprocessing the data in four sub-processes, i.e. requirements, collection, understanding, and preparation; iii) configuring the Random Forest algorithm, i.e. the number "n" of trees that will form the forest and building the trees of the random forest; and iv) evaluating the model and analyzing the results using the Python programming language. The experiments were conducted using a random forest machine-learning algorithm with an n_estimators value of 100 and a tree depth (max_depth) of 4, with learning scenarios of 70:30, 80:20, and 90:10 on 444 cases of acute coronary syndrome data. The results show that the 70:30 scenario produces the best model, with an accuracy of 83.45%, a precision of 85%, and a recall of 92.4%. The experimental results were evaluated with various statistical metrics (accuracy, precision, and recall) in each learning scenario on the 444 acute coronary syndrome cases, with 10-fold cross-validation.
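A minimal reproduction of the reported configuration in scikit-learn, a RandomForestClassifier with n_estimators=100 and max_depth=4, a 70:30 split, and 10-fold cross-validation, with synthetic data standing in for the 444 ACS records.

```python
# Sketch of the reported setup: RandomForestClassifier(n_estimators=100, max_depth=4),
# a 70:30 split, and 10-fold cross-validation. Synthetic data stands in for the 444 ACS records.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import accuracy_score, precision_score, recall_score

X, y = make_classification(n_samples=444, n_features=12, random_state=0)   # stand-in data

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, max_depth=4, random_state=0).fit(X_train, y_train)

y_pred = clf.predict(X_test)
print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred))
print("recall   :", recall_score(y_test, y_pred))
print("10-fold CV accuracy:", cross_val_score(clf, X, y, cv=10).mean())
```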
Klasifikasi Citra Pigmen Kanker Kulit Menggunakan Convolutional Neural Network Luqman Hakim; Zamah Sari; Handhajani Handhajani
Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) Vol 5 No 2 (2021): April 2021
Publisher : Ikatan Ahli Informatika Indonesia (IAII)

Full PDF (331.56 KB) | DOI: 10.29207/resti.v5i2.3001

Abstract

Skin cancer is a very common form of cancer in the United States, with annual treatment costs exceeding $8 billion. Innovations in the classification and detection of skin cancer using artificial neural networks continue to develop, helping the medical world analyze images accurately. The researchers propose to classify skin cancer pigments into two classes, non-melanocytic malignant and benign, where the skin cancer categories classified as non-melanocytic malignant are actinic keratoses and basal cell carcinoma, while those classified as benign are benign keratosis-like lesions, dermatofibroma, and vascular lesions. The method used in this study is a Convolutional Neural Network (CNN) with an architecture of 8 Convolutional 2D layers with filters (16, 16, 32, 32, 64, 64, 128, 128). The kernel size of the first layer is (20,20) and the following layers use (5,5) and (3,3); the pooling types used are MaxPooling and AveragePooling. The fully connected layers are (256, 128) with a Dropout of 0.2. The dataset is obtained from the International Skin Imaging Collaboration (ISIC) 2018, with a total of 10,015 images. Based on the test and evaluation reports, an accuracy of 75% is obtained, with the highest precision and recall values found in the benign class, namely 0.80 and 0.82 respectively, and an f1_score of 0.81.
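A sketch of a CNN along the lines described, eight Conv2D layers with filters (16, 16, 32, 32, 64, 64, 128, 128), a large first kernel followed by 5x5 and 3x3 kernels, mixed Max/Average pooling, dense layers (256, 128) and Dropout(0.2), written with Keras; the input size, padding, and exact placement of the pooling layers are assumptions.

```python
# Sketch of the described architecture; input size, padding, and pooling placement are assumptions.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(16, (20, 20), padding="same", activation="relu", input_shape=(128, 128, 3)),
    layers.Conv2D(16, (5, 5), padding="same", activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, (3, 3), padding="same", activation="relu"),
    layers.Conv2D(32, (3, 3), padding="same", activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, (3, 3), padding="same", activation="relu"),
    layers.Conv2D(64, (3, 3), padding="same", activation="relu"),
    layers.AveragePooling2D(),
    layers.Conv2D(128, (3, 3), padding="same", activation="relu"),
    layers.Conv2D(128, (3, 3), padding="same", activation="relu"),
    layers.AveragePooling2D(),
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.2),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # benign vs. non-melanocytic malignant
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```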
