Articles

Found 22 Documents

In-House Refurbishing and Outsourcing Refurbishing Models with Degree of Interchangeability in Product Design
Authors: Kurdhi, Nughthoh Arfawi; Vania, Kezia Abigail; Widyaningsih, Purnami; Sudibyo, Nugroho Arif
JTAM (Jurnal Teori dan Aplikasi Matematika) Vol 7, No 4 (2023): October
Publisher : Universitas Muhammadiyah Mataram

DOI: 10.31764/jtam.v7i4.15787

Abstract

Refurbishing is the process of restoring used products to the quality of new ones. It can be performed by the manufacturer itself (in-house), or the manufacturer can delegate the refurbishing process to another manufacturer (outsourcing). This research aims to construct an in-house refurbishing model and an outsourced refurbishing model, determine and analyze their optimal solutions, apply them so that optimal profits are obtained, and compare the two models. Multivariable function optimization is used to obtain the optimal profit. Products with an interchangeability design are products that can be used to replace similar products with the same function. Judging from the optimal production results, manufacturers that refurbish in-house choose a higher degree of interchangeability and produce more new products. In terms of economic benefit, manufacturers that refurbish in-house earn greater profits than those that outsource refurbishing. In terms of environmental sustainability, outsourced refurbishing is more environmentally friendly than in-house refurbishing.
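The comparison the abstract describes can be illustrated with a toy profit model. The sketch below is purely hypothetical: the price and cost parameters, the linear effect of the interchangeability degree, and the grid search are all invented for demonstration and are not the paper's actual multivariable formulation; the paper's model should be consulted for the real functional forms.

```python
# Toy sketch of the in-house vs. outsourced refurbishing comparison.
# All parameters and functional forms below are illustrative assumptions,
# not the model from Kurdhi et al. (2023).

def profit(q_new, delta, refurb_cost, p_new=120.0, p_ref=80.0,
           c_new_base=60.0, returns=50):
    """Profit for one period under hypothetical linear cost effects.

    delta in [0, 1] is the degree of interchangeability: assumed to raise
    new-product unit cost (design premium) but lower refurbishing unit cost
    (interchangeable parts are easier to reuse).
    """
    c_new = c_new_base * (1 + 0.2 * delta)   # design-for-interchangeability premium
    c_ref = refurb_cost * (1 - 0.5 * delta)  # interchangeability eases refurbishing
    q_ref = returns                          # assume all returned units are refurbished
    return (p_new - c_new) * q_new + (p_ref - c_ref) * q_ref

def best(refurb_cost):
    """Coarse grid search over production quantity and interchangeability degree."""
    candidates = [(profit(q, d / 10, refurb_cost), q, d / 10)
                  for q in range(0, 201, 10)   # demand capped at 200 units
                  for d in range(0, 11)]
    return max(candidates)                    # (profit, q_new, delta) at the optimum

# In-house refurbishing is modeled with a lower per-unit refurbishing cost
# than outsourcing (which adds a hypothetical outsourcing fee).
in_house = best(refurb_cost=30.0)
outsource = best(refurb_cost=45.0)
print("in-house optimum :", in_house)
print("outsourced optimum:", outsource)
```

Because the only difference between the two scenarios here is the refurbishing cost, the in-house optimum can never be worse than the outsourced one in this toy setup; the paper's richer model is what yields the substantive trade-offs reported in the abstract.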
HYBRID INTEGRATION OF BERT AND BILSTM MODELS FOR SENTIMENT ANALYSIS
Authors: Tambunan, Nicolas Ray Amarco; Saputro, Dewi Retno Sari; Widyaningsih, Purnami
BAREKENG: Jurnal Ilmu Matematika dan Terapan Vol 20 No 2 (2026): BAREKENG: Journal of Mathematics and Its Application
Publisher : PATTIMURA UNIVERSITY

DOI: 10.30598/barekengvol20iss2pp1719-1730

Abstract

The rapid growth of sentiment analysis research has driven increasing interest in deep learning models, particularly transformer-based architectures such as BERT and recurrent neural networks such as BiLSTM. While both approaches have shown substantial success in text classification tasks, each presents distinct strengths and limitations. This study analyzes the integration of BERT and BiLSTM models to enhance sentiment classification performance by combining contextual and sequential learning. A bibliometric analysis was conducted using VOSviewer on Scopus-indexed publications from 2020 to 2025, identifying four major thematic clusters related to transformer modeling, recurrent architectures, hybrid integration, and methodological advancements. Comparative findings on benchmark datasets, including SST-2, IMDb, and Yelp Reviews, indicate that hybrid BERT–BiLSTM models achieve higher accuracy than single models, reaching up to 97.67% on the IMDb dataset, although this improvement comes with increased computational complexity. The proposed framework combines BERT's contextual embeddings with BiLSTM's sequential modeling, offering a foundation for developing adaptive and multilingual sentiment analysis systems. The results highlight future directions in optimizing hybrid architectures for efficiency, cross-lingual adaptability, and domain-specific sentiment understanding.
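The hybrid architecture the abstract describes feeds contextual token embeddings (from BERT) into a BiLSTM, then classifies from the concatenated final hidden states. A minimal NumPy sketch of that pipeline shape is shown below; the random matrix standing in for BERT output, the tiny dimensions, and the untrained weights are all assumptions for illustration, not the authors' implementation, which would use a pretrained BERT encoder and learned parameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gates (input, forget, output, candidate) are stacked."""
    z = W @ x + U @ h + b                    # shape (4*H,)
    H = h.shape[0]
    i = sigmoid(z[0:H]); f = sigmoid(z[H:2*H])
    o = sigmoid(z[2*H:3*H]); g = np.tanh(z[3*H:4*H])
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def bilstm(seq, params_fwd, params_bwd, H):
    """Run the sequence forward and backward; concatenate final hidden states."""
    hf = np.zeros(H); cf = np.zeros(H)
    hb = np.zeros(H); cb = np.zeros(H)
    for x in seq:                            # forward direction
        hf, cf = lstm_step(x, hf, cf, *params_fwd)
    for x in reversed(seq):                  # backward direction
        hb, cb = lstm_step(x, hb, cb, *params_bwd)
    return np.concatenate([hf, hb])          # pooled representation, shape (2*H,)

rng = np.random.default_rng(0)
T, D, H = 6, 8, 4                            # toy sequence length, embedding dim, hidden size
embeddings = rng.normal(size=(T, D))         # stand-in for BERT contextual token embeddings

def init(D, H):
    """Random untrained LSTM parameters (W, U, b) for demonstration only."""
    return (rng.normal(scale=0.1, size=(4 * H, D)),
            rng.normal(scale=0.1, size=(4 * H, H)),
            np.zeros(4 * H))

feat = bilstm(embeddings, init(D, H), init(D, H), H)
W_cls = rng.normal(scale=0.1, size=(2, 2 * H))  # two-class sentiment head
logits = W_cls @ feat
probs = np.exp(logits) / np.exp(logits).sum()   # softmax over {negative, positive}
print(feat.shape, probs)
```

In the reported hybrid, the motivation for this stacking is that BERT supplies deep contextual token representations while the BiLSTM adds explicit left-to-right and right-to-left sequential modeling before classification, at the price of extra computation per example.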