Articles

Found 13 Documents

High Performance Computing on Cluster and Multicore Architecture Ahmad Ashari; Mardhani Riasetiawan
TELKOMNIKA (Telecommunication Computing Electronics and Control) Vol 13, No 4: December 2015
Publisher : Universitas Ahmad Dahlan

DOI: 10.12928/telkomnika.v13i4.2156

Abstract

High-performance computing raises several issues around architecture, resources, computational models, and data; the challenge is to establish a mature architecture with scalable resources. This research develops cluster and multicore architectures and analyzes their performance. The cluster architecture is built on Raspberry Pi single-board computers running MPICH2 on the Raspbian Wheezy operating system and is tested with matrix-computation applications. The multicore architecture is built on single computers with Core i5 and Core i7 processors. The research uses the himeno98 and himeno16Large tools to analyze processor and memory allocation; the tests run on 1000x1000 matrices and are benchmarked with OpenMP. The analysis focuses on CPU time, FLOPS, and score. The cluster architecture yields 2576.07 s of CPU time, 86.96 MFLOPS, and a score of 2.69; the Core i5 architecture yields 55.57 s of CPU time, 76.30 MFLOPS, and a score of 0.92; and the Core i7 architecture yields 59.56 s of CPU time, 1427.61 MFLOPS, and a score of 17.23. Together, the cluster and multicore results show that the computing process is affected by the architecture model. The architectures built here offer lessons for the development of HPC architecture models and provide a performance baseline; in future work they will be used to determine a delivery architecture model for HPC and will be tested with a wider variety of loads.
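
As an illustration of the kind of measurement the abstract describes, the sketch below times a 1000x1000 matrix multiplication and derives a MFLOPS figure. It is not the paper's benchmark (himeno98/himeno16Large solve a Poisson equation); it merely shows how CPU time and throughput relate on a matrix workload of the same size.

```python
# Illustrative sketch (not the paper's benchmark): time a 1000x1000 dense
# matrix multiply and estimate throughput in MFLOPS.
import time
import numpy as np

N = 1000                      # matrix dimension used in the paper's tests
a = np.random.rand(N, N)
b = np.random.rand(N, N)

start = time.perf_counter()
c = a @ b                     # dense matmul: roughly 2*N^3 floating-point ops
elapsed = time.perf_counter() - start

flops = 2 * N**3              # multiplies plus adds for the naive algorithm
print(f"CPU time: {elapsed:.3f} s, throughput: {flops / elapsed / 1e6:.2f} MFLOPS")
```
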
The Analyses on Dynamic and Dedicated Resource Allocation on Xen Server Mardhani Riasetiawan; Ahmad Ashari; Irwan Endrayanto
TELKOMNIKA (Telecommunication Computing Electronics and Control) Vol 14, No 1: March 2016
Publisher : Universitas Ahmad Dahlan

DOI: 10.12928/telkomnika.v14i1.2321

Abstract

Today's data centers must not only serve users but at the same time provide scalable resources. A data center manages resources such as processors, storage, network, and memory in ways appropriate to its load. In the big-data era the load increases and arrives rapidly, with large data volumes, many data types (both stream and batch), and unknown sources, so resources must be managed with comprehensive strategies that match these characteristics. Data centers can allocate resources in either dynamic or dedicated ways. This research investigates the performance of dedicated and dynamic resource allocation in order to identify reliable strategies for a data center. The work uses the XenServer platform as the data center, defining 18 virtual machines under both the dedicated and the dynamic strategy and using shared storage and resource pools. The research analyzes CPU performance on XenServer1 and XenServer2, which are designed as a clustered data center. The tests reveal two phases when the data center allocates resources: an initiation phase and a process phase. In the initiation phase, neither the dynamic nor the dedicated strategy is running yet, and the initial resources are used to establish the data center. In the process phase, the dynamic and dedicated strategies run and generate load; memory use and CPU performance then settle toward a balanced position. The results indicate that resource allocation requires different strategies for the initiation phase and the process phase.
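
A hypothetical monitoring sketch, not from the paper: one simple way to separate an idle initiation phase from a loaded process phase, as the study does for its XenServer cluster, is to sample host CPU and memory over time and label each sample against a utilization threshold. The `psutil` library and the threshold values here are assumptions for illustration.

```python
# Sample CPU and memory utilization and label each sample as belonging to
# the "initiation" phase (idle) or the "process" phase (under load).
import time
import psutil  # assumed available; any sampling agent would do

SAMPLES, INTERVAL, BUSY_CPU = 60, 1.0, 20.0   # illustrative thresholds

for i in range(SAMPLES):
    cpu = psutil.cpu_percent(interval=INTERVAL)   # % CPU over the interval
    mem = psutil.virtual_memory().percent         # % memory in use
    phase = "process" if cpu > BUSY_CPU else "initiation"
    print(f"t={i:3d}s cpu={cpu:5.1f}% mem={mem:5.1f}% phase={phase}")
```
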
Implementasi SOA dalam Layanan Emall KUKM Studi Kasus Kementerian Koperasi dan Usaha Kecil dan Menengah RI Wiro Santoso Waas; Lukito Edi Nugroho; Mardhani Riasetiawan
Seminar Nasional Aplikasi Teknologi Informasi (SNATI) 2014
Publisher : Jurusan Teknik Informatika, Fakultas Teknologi Industri, Universitas Islam Indonesia


Abstract

Through the Ministry of Cooperatives and Small and Medium Enterprises, the government has sought to improve service quality by applying information and communication technology through a one-stop e-catalog service. In general, this service has met the product promotion and marketing needs of cooperatives and small and medium enterprises (KUKM). However, convenient transactions also require automated registration validation, a shopping cart, a cashier, automated payment, online shipment tracking, and real-time information, which the existing service cannot yet accommodate. This research aims to implement Service Oriented Architecture (SOA) in the design of an automated, one-stop e-commerce service integrated with related services. The system model is designed with the Unified Modelling Language (UML); the application is built with the PHP programming language, a MySQL database, and cURL web services for system integration. The research produces an SOA implementation architecture model as an integrated-system solution linking the KUKM e-mall, banks, shipping expeditions, and an SMS gateway. Testing is carried out with functional tests.
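
The paper's integration layer is PHP with cURL; as a hedged Python analogue of the same pattern, the sketch below shows the e-mall calling an external web service and consuming its JSON response. The endpoint, parameters, and response fields are invented for illustration; the real expedition API is not named in the abstract.

```python
# Hypothetical web-service call mirroring the SOA integration pattern:
# the e-mall queries an external shipment tracker over HTTP.
import requests

def track_shipment(tracking_number: str) -> dict:
    # Invented endpoint standing in for the real expedition service.
    resp = requests.get(
        "https://api.example-expedition.test/v1/track",
        params={"number": tracking_number},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()        # e.g. {"status": "in transit", "eta": "..."}

if __name__ == "__main__":
    print(track_shipment("JNE123456789"))
```
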
Data Integrity and Security using Keccak and Digital Signature Algorithm (DSA) Muhammad Asghar Nazal; Reza Pulungan; Mardhani Riasetiawan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems) Vol 13, No 3 (2019): July
Publisher : IndoCEISS in colaboration with Universitas Gadjah Mada, Indonesia.

DOI: 10.22146/ijccs.47267

Abstract

Data security is a very important consideration in cloud computing; one ongoing research effort that uses cloud technology for storage is G-Connect. One aspect the G-Connect project is developing is data security, where most of the problems concern verification of the data sent. In previous studies, the Keccak and RSA algorithms were implemented for data-verification needs. A literature study of other digital-signature algorithms, however, identified one that outperforms RSA in speed: the Digital Signature Algorithm (DSA). DSA is a public-key algorithm used for digital signatures, but because it traditionally relies on the Secure Hash Algorithm (SHA-1) as its hash function, DSA is rarely used for data-security purposes; Keccak, which has become the standard for the new SHA-3 hash function, is therefore used in place of DSA's hash algorithm. Given these problems, this research focuses on data verification using Keccak and DSA. The results prove that Keccak can run within the DSA scheme, and a comparison of execution times between DSA and RSA, both using Keccak, is obtained.
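
A minimal sketch of the scheme the abstract describes: a DSA signature whose message digest is SHA3-256 (SHA-3 being the standardized form of Keccak). This uses the pyca/cryptography library rather than the paper's own code, and assumes the underlying OpenSSL build accepts SHA-3 digests with DSA.

```python
# Sign and verify a message with DSA using a SHA3-256 (Keccak-family) digest.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import dsa

private_key = dsa.generate_private_key(key_size=2048)
message = b"payload stored in G-Connect"   # illustrative payload

signature = private_key.sign(message, hashes.SHA3_256())

# verify() raises InvalidSignature if the data or signature was altered.
private_key.public_key().verify(signature, message, hashes.SHA3_256())
print("signature verified")
```
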
The Development of IoT Compression Technique To Cloud Kartika Sari; Mardhani Riasetiawan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems) Vol 13, No 4 (2019): October
Publisher : IndoCEISS in colaboration with Universitas Gadjah Mada, Indonesia.

DOI: 10.22146/ijccs.47270

Abstract

The main problem in data transmission is how to reduce the length of data-packet delivery and thereby the time needed to send data. One method for reducing data size is compression: a technique for shrinking data below its original size so that data exchange takes less time. This study develops data-compression techniques by modifying and combining coding and modelling techniques based on the RAKE algorithm. The experiments test four different methods over five different time periods to determine the compression-efficiency and decompression-efficiency parameters and the data-transmission time parameters. The results show that a coding technique that converts decimal sensor data to binary, combined with a modelling technique that computes the residue of the sensor values, produces small data and achieves a compression-efficiency value of 45%; a coding technique using ASCII with a modelling technique based on XOR operations produces larger data and a compression-efficiency value of 71%. In the decompression tests the decompression-efficiency value is 100%: no data is lost.
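
To make the modelling step concrete, here is an illustrative residue (delta) encoder of the kind the abstract mentions: store the first sensor reading, then only the difference from the previous reading, which keeps values small and more compressible. This is one ingredient of a RAKE-style pipeline, not the paper's full method; the framing/coding stage is omitted.

```python
# Residue (delta) modelling of sensor readings, with a lossless round trip.
def residue_encode(values):
    out = [values[0]]
    out += [b - a for a, b in zip(values, values[1:])]   # successive deltas
    return out

def residue_decode(residues):
    out = [residues[0]]
    for r in residues[1:]:
        out.append(out[-1] + r)                          # re-accumulate
    return out

readings = [27, 27, 28, 28, 29, 27]          # e.g. temperature samples
encoded = residue_encode(readings)           # [27, 0, 1, 0, 1, -2]
assert residue_decode(encoded) == readings   # decompression loses nothing
print(encoded)
```
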
The Evaluation QS-WFQ Scheduling Algorithm For IoT Transmission To Cloud Hirzen Hasfani; Mardhani Riasetiawan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems) Vol 14, No 1 (2020): January
Publisher : IndoCEISS in colaboration with Universitas Gadjah Mada, Indonesia.

DOI: 10.22146/ijccs.48157

Abstract

This study uses a Weighted Fair Queue scheduling algorithm in which the weights can change and are calculated from changes in the average queue size in the buffer. The algorithm divides the sensors into three priorities: high, medium, and low. Each queue is given a weight adjusted to the resource requirements of its traffic. High-priority data takes precedence, but medium- and low-priority data are still served and guaranteed network resources. The results show a packet-loss ratio of 0% when the ratio of buffer capacity to data volume is 1:3, with high:medium:low buffer allocations of 75:75:150 and 50:50:200. The high-priority and medium-priority buffers exhibit almost the same delay when data is transmitted, whereas the delay of the low-priority buffer increases.
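
A toy sketch of weighted service over three priority queues, in the spirit of the QS-WFQ setup: each round, a queue is allowed to send a number of packets proportional to its weight, so low priority is delayed but never starved. The weights and queue contents are illustrative, and the paper's adaptive step (recomputing weights from the average queue size) is omitted.

```python
# Serve three priority queues in weighted round-robin fashion.
from collections import deque

queues = {
    "high":   deque(f"h{i}" for i in range(6)),
    "medium": deque(f"m{i}" for i in range(6)),
    "low":    deque(f"l{i}" for i in range(6)),
}
weights = {"high": 3, "medium": 2, "low": 1}   # packets served per round

while any(queues.values()):
    for prio, q in queues.items():
        for _ in range(weights[prio]):         # up to `weight` packets per turn
            if q:
                print("sent", q.popleft(), "from", prio)
```
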
Hashtag war: 2019 Presidential election rhetoric in Indonesia Avin Fadilla Helmi; Mardhani Riasetiawan; Acintya Ratna Priwati; Itsna Mawadatta Rahma; Arlianto Arlianto; Ramadhan Dwi Marvianto; Rinanda Rizky Amalia Shaleha
HUMANITAS: Indonesian Psychological Journal Vol 17, Number 2: August 2020
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/humanitas.v17i2.17045

Abstract

Twitter has become an increasingly used social medium, including by each presidential candidate's camp to campaign and influence prospective voters' electoral decisions in the 2019 Indonesian presidential election. One campaign strategy on Twitter was to disseminate hashtags in the hope that they would become trending topics; this dissemination aimed to build political rhetoric that could influence prospective voters' electoral decisions. This study therefore explored the patterns of hashtags disseminated by each candidate's camp to build political rhetoric, and examined public sentiment in the posted Twitter content. A total of 92,276 tweets with the #Jokowi2Periode and #2019GantiPresiden hashtags were downloaded using MAXQDA 18.1.1 software during the period of the 2019 Indonesian presidential and vice-presidential debates. The findings reveal that the #Jokowi2Periode hashtag spread in a more scattered (decentralized) pattern, relying on the actors' self-presentation and their speed in responding to tweets, whereas the #2019GantiPresiden hashtag spread in a more centralized pattern, relying on communication channels on Twitter. These two distribution patterns are discussed from the perspective of cyberpsychology, through cues-filtered-in and cues-filtered-out theories.
PENGEMBANGAN APLIKASI INFORMATION GATHERING MENGGUNAKAN METODE HYBRID SCAN BERBASIS GRAPHICAL USER INTERFACE Mardhani Riasetiawan; Akas Wisnuaji; Dedy Hariyadi; Tri Febrianto
Cyber Security dan Forensik Digital Vol. 4 No. 1 (2021): Edisi Mei 2021
Publisher : Fakultas Sains dan Teknologi UIN Sunan Kalijaga Yogyakarta

DOI: 10.14421/csecurity.2021.4.1.2449

Abstract

Some computer-system and network-security analysts state that testing applications or tools based on a command-line interface (CLI) greatly ease their work. However, quite a few of these tools are not comprehensive, either in how they analyze or in the reports they produce. Reports from system- and network-security testing should ideally come in at least two forms: one for management and one for the technical team. This paper proposes the development of a comprehensive security-testing application for computer systems and networks whose reports serve both the management and technical teams. The application is developed in the Python programming language with the Tkinter module, producing a graphical user interface (GUI) that, it is hoped, anyone can use. Development focuses on the Information Gathering stage using a Hybrid Scan method comprising Passive and Active scans: the Passive Scan uses 11 Application Programming Interfaces (APIs), while the Active Scan uses Python's socket module and several native applications running on GNU/Linux.
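
Since the abstract names Python's socket module for the Active Scan, here is a minimal active-scan sketch of the usual kind: a TCP connect check against a handful of ports. The host and port list are illustrative, this is not the tool's own code, and such probes should only be run against systems you are authorized to test.

```python
# Check whether a TCP port accepts connections (a basic active scan).
import socket

def tcp_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0   # 0 means the connect succeeded

for port in (22, 80, 443):
    state = "open" if tcp_port_open("127.0.0.1", port) else "closed/filtered"
    print(f"port {port}: {state}")
```
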
Analisis Topik dan Aktor pada Diskusi GeNose C19 Andi Budiansyah; Mardhani Riasetiawan; Achmad Djunaedi; Ilmi Afrizal Rachim
Jurnal ILMU KOMUNIKASI Vol. 21 No. 1 (2024)
Publisher : Departemen Ilmu Komunikasi FISIP Universitas Atma Jaya Yogyakarta

DOI: 10.24002/jik.v21i1.7020

Abstract

The GeNose C19 has raised discussion on Twitter. This study seeks to analyze and categorize the topics, and the actors who played a significant role, in spreading knowledge about the use of GeNose C19 on Twitter between March 1st, 2020, and December 20th, 2021. The findings cover various topics, namely those pertaining to the device's mechanism and operation, its advantages, marketing approval, user experience, and product comparisons. The actors who played a significant role in spreading knowledge were from the media, universities, and government; in addition, state-owned companies played an important part in distributing technical knowledge to the public.
Klasifikasi Level Banjir Menggunakan Random Forest dan Support Vector Machine Qamarani, Larasati Syarafina; Riasetiawan, Mardhani
IJEIS (Indonesian Journal of Electronics and Instrumentation Systems) Vol 14, No 2 (2024): October
Publisher : IndoCEISS in colaboration with Universitas Gadjah Mada, Indonesia.

DOI: 10.22146/ijeis.97043

Abstract

Floods are among the most common natural disasters in Indonesia. This study analyzes the impact of each flood event by examining factors such as duration, water level, and the number of affected individuals to identify flood characteristics by severity. Climate variables such as temperature, humidity, rainfall, and wind speed are investigated as parameters characterizing flood occurrences. The primary objective is to classify flood levels using the Random Forest and Support Vector Machine (SVM) algorithms and to evaluate the accuracy of these classifications using a confusion matrix; the outcomes are intended to inform decision-making during floods and thereby minimize associated losses. The research uses historical flood data from the DKI Jakarta BPBD, accessed through the Satu Data Jakarta website, and climate data from the BMKG Geophysical Station, covering the period from 2013 to 2020. The Random Forest classifier demonstrated exceptional performance, achieving an accuracy of 99.21%; the SVM classifier also performed robustly, with an accuracy of 98.43%. Both models initially exhibited overfitting during the early stages of model development, but this issue diminished as the dataset size increased, enhancing the models' generalization capabilities.
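
A hedged sketch of the evaluation setup the abstract describes: train a Random Forest and an SVM and score each with an accuracy figure and a confusion matrix, using scikit-learn. Synthetic data stands in for the BPBD/BMKG flood and climate records, and the hyperparameters are defaults rather than the paper's.

```python
# Train and evaluate Random Forest and SVM classifiers on stand-in data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic features standing in for temperature, humidity, rainfall, etc.
X, y = make_classification(n_samples=800, n_features=6, n_classes=3,
                           n_informative=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for name, model in [("Random Forest", RandomForestClassifier(random_state=0)),
                    ("SVM", SVC())]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(name, "accuracy:", accuracy_score(y_te, pred))
    print(confusion_matrix(y_te, pred))      # rows: true class, cols: predicted
```
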