Pengembangan Stochastic Gradient Descent dengan Penambahan Variabel Tetap (Development of Stochastic Gradient Descent with the Addition of a Fixed Variable)
Adimas Tristan Nagara Hartono; Hindriyanto Dwi Purnomo
Jurnal JTIK (Jurnal Teknologi Informasi dan Komunikasi) Vol 7 No 3 (2023): JULY-SEPTEMBER 2023
Publisher: KITA Institute

DOI: 10.35870/jtik.v7i3.840

Abstract

Stochastic Gradient Descent (SGD) is one of the most commonly used optimizers in deep learning. In this work, we modify SGD by adding a fixed variable and compare the performance of standard SGD against the modified variant. The study proceeded in five phases: (1) analysis of the optimizer, (2) design of the modification, (3) implementation of the modification, (4) testing of the modification, and (5) reporting. The results are intended to show the impact of the added fixed variable on the performance of SGD.
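
For illustration only, the sketch below contrasts a standard SGD parameter update with one plausible reading of the modification, in which a small fixed constant is added inside the update. The abstract does not specify the exact formulation, so the function names sgd_step and sgd_fixed_step, the constant c, and the toy quadratic loss are all assumptions, not the authors' method.

import numpy as np

def sgd_step(params, grads, lr=0.01):
    # Standard SGD update: theta <- theta - lr * grad
    return params - lr * grads

def sgd_fixed_step(params, grads, lr=0.01, c=1e-4):
    # Hypothetical variant: a fixed constant c is added to the gradient
    # before scaling. The paper's actual formulation may differ.
    return params - lr * (grads + c)

# Toy comparison on the quadratic loss f(theta) = 0.5 * ||theta||^2,
# whose gradient is simply theta.
theta_a = np.array([1.0, -2.0, 0.5])
theta_b = np.array([1.0, -2.0, 0.5])
for _ in range(100):
    theta_a = sgd_step(theta_a, theta_a)
    theta_b = sgd_fixed_step(theta_b, theta_b)

print("standard SGD:         ", theta_a)
print("SGD + fixed variable: ", theta_b)

Whether the constant is added to the gradient, to the learning rate, or to the final update would change the behavior; the sketch only illustrates the comparison setup implied by the phases listed in the abstract.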