
Found 2 Documents
Journal: Bulletin of Electrical Engineering and Informatics

Handling concept drifts and limited label problems using semi-supervised combine-merge Gaussian mixture model
Ibnu Daqiqil Id; Pardomuan Robinson Sihombing; Supratman Zakir
Bulletin of Electrical Engineering and Informatics Vol 10, No 6: December 2021
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/eei.v10i6.3259

Abstract

When predicting data streams, changes in data distribution may decrease model accuracy over time, making the model obsolete. This phenomenon is known as concept drift. Detecting concept drifts and adapting to them are critical to maintaining model performance. However, model adaptation is possible only when labeled data is available. Labeling data is costly and time-consuming because it must be done by humans, and because data streams are massive and arrive at high speed, only part of the data can be labeled. To solve these problems simultaneously, we update the model using both labeled and unlabeled instances. The experimental results show that the proposed method can adapt to concept drift with pseudo-labels and maintain its accuracy even when label availability is drastically reduced from 95% to 5%. The proposed method also achieves the highest overall accuracy, outperforming other methods on 5 of 10 datasets.
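The pseudo-label adaptation step described in the abstract can be sketched as follows. This is a minimal illustration on synthetic data using scikit-learn's generic GaussianMixture, not the paper's combine-merge GMM; the 5% label rate and the 0.9 confidence threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic two-class stream; only ~5% of instances carry labels.
X = np.vstack([rng.normal(-2, 1, (500, 2)), rng.normal(2, 1, (500, 2))])
y = np.array([0] * 500 + [1] * 500)
labeled = rng.random(1000) < 0.05

# Fit a mixture on the small labeled subset.
gmm = GaussianMixture(n_components=2, random_state=0)
gmm.fit(X[labeled])

# Map each mixture component to a class by majority vote over labeled points.
comp = gmm.predict(X[labeled])
mapping = {c: np.bincount(y[labeled][comp == c]).argmax() for c in np.unique(comp)}

# Pseudo-label unlabeled instances whose component posterior is confident.
proba = gmm.predict_proba(X[~labeled])
confident = proba.max(axis=1) > 0.9
pseudo = np.array([mapping[c] for c in proba.argmax(axis=1)])

# Adaptation step: refit on labeled plus confidently pseudo-labeled data.
X_adapt = np.vstack([X[labeled], X[~labeled][confident]])
gmm.fit(X_adapt)
```

Under drift, the same refit would be triggered whenever a drift detector fires, letting the model track the new distribution without waiting for human labels.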
Continual learning on audio scene classification using representative data and memory replay GANs
Daqiqil ID, Ibnu; Abe, Masanobu; Hara, Sunao
Bulletin of Electrical Engineering and Informatics Vol 14, No 1: February 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/eei.v14i1.8127

Abstract

This paper proposes a methodology aimed at resolving the catastrophic forgetting problem by choosing a limited portion of the historical dataset to act as a representative memory. The method harnesses generative adversarial networks (GANs) to create samples that expand upon the representative memory. Its main advantage is that it not only prevents catastrophic forgetting but also improves backward transfer while keeping the memory relatively stable and small. The experimental results show that combining real representative data with artificially generated data from GANs yielded better outcomes and counteracted the negative effects of catastrophic forgetting more effectively than relying solely on GAN-generated data. This mixed approach creates a richer training environment, aiding the retention of previous knowledge. Additionally, when comparing different selection methods as the proportion of GAN-generated data increases, the low-probability and mean-cluster methods performed best: they select more informative samples, yielding resilient and consistent overall performance.
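A mean-cluster style selection, as named in the abstract, can be sketched as follows: cluster the old-task features and keep the samples nearest each centroid as the representative memory. This is a hedged illustration, not the paper's exact procedure; the function name, cluster count, and memory budget are assumptions, and the random features stand in for audio-scene embeddings.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

def mean_cluster_memory(X, budget, n_clusters=4):
    """Select the samples closest to each cluster centroid as representative memory."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    per_cluster = budget // n_clusters
    keep = []
    for c in range(n_clusters):
        idx = np.where(km.labels_ == c)[0]
        # Distance of each member to its centroid; keep the nearest ones.
        d = np.linalg.norm(X[idx] - km.cluster_centers_[c], axis=1)
        keep.extend(idx[np.argsort(d)[:per_cluster]])
    return np.array(keep)

# Old-task features (stand-ins for learned audio embeddings).
X_old = rng.normal(size=(400, 8))
memory_idx = mean_cluster_memory(X_old, budget=40)
memory = X_old[memory_idx]
# `memory` would then be mixed with GAN-generated replay samples
# when training on the next task.
```

Keeping centroid-nearest samples biases the memory toward prototypical examples of each old class, which is one plausible reason such selection remains informative as the share of GAN-generated data grows.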