Articles

Found 2 Documents
Journal : Journal of Applied Data Sciences

Improving Publishing: Extracting Keywords and Clustering Topics
Authors : Soekamto, Yosua Setyawan; Maryati, Indra; Christian, Christian; Kurniawan, Edwin
Journal of Applied Data Sciences Vol 5, No 2: MAY 2024
Publisher : Bright Publisher

DOI: 10.47738/jads.v5i2.199

Abstract

Humans, by nature, are inclined to share knowledge across various platforms, such as educational institutions, media outlets, and specialized research publications like journals and conferences. The consistent oversight and evaluation of these publications by ranking bodies serve to maintain the integrity and quality of scholarly discourse on a global scale. However, there has been a decline in the proliferation of such publications in recent times, partly attributed to ethical misconduct within specific segments of the scholarly community. Despite implementing systems such as the Open Journal System (OJS), publishers grapple with the formidable task of managing editorial and review processes. Compounding the problem, the multifaceted nature of scholarly content means that manual review procedures often demand considerable time. Thus, a pressing need exists for advanced technological solutions to streamline the article selection process, empowering publishers to prioritize articles for review based on topical relevance. This study advocates adopting a comprehensive framework integrating advanced text analysis techniques such as keyword extraction, topic clustering, and summarization algorithms. These tools can be implemented and integrated by connecting them to the existing system's database. By combining these tools with the expertise of editorial and review teams, publishers can significantly expedite the initial assessment of submitted articles. Given the rapid pace of technological advancement, publishers must embrace robust systems that enhance efficiency and effectiveness, particularly in reviewer assignment and article prioritization. This research employs a neural network approach based on BERT together with K-Means clustering to perform keyword extraction and topic clustering. Using BERT facilitates accurate semantic understanding and context-aware representation of textual data, and its pre-trained models can be fine-tuned to customize them for specific domains or tasks. By harnessing the power of BERT, publishers can gain deeper insights into the content of scholarly articles, leading to more informed decision-making and improved publication outcomes.
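The pipeline described in this abstract, keyword extraction with BERT embeddings followed by K-Means topic clustering, can be sketched roughly as follows. This is a minimal illustration only: the abstract does not name the specific BERT variant, libraries, or candidate-selection strategy, so the sentence-transformers model "all-MiniLM-L6-v2", scikit-learn, and the KeyBERT-style ranking of candidate n-grams are assumptions made here for illustration, not the authors' actual implementation.

# Minimal sketch (assumed libraries: sentence-transformers, scikit-learn).
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed BERT-family encoder

def extract_keywords(text, top_n=5):
    # Candidate phrases are unigrams/bigrams drawn from the document itself,
    # ranked by cosine similarity between phrase and document embeddings.
    vectorizer = CountVectorizer(ngram_range=(1, 2), stop_words="english").fit([text])
    candidates = vectorizer.get_feature_names_out()
    doc_emb = model.encode([text])
    cand_embs = model.encode(list(candidates))
    scores = cosine_similarity(doc_emb, cand_embs)[0]
    return [candidates[i] for i in scores.argsort()[::-1][:top_n]]

def cluster_topics(abstracts, n_topics=5):
    # Embed each submission and group semantically similar ones with K-Means,
    # so editors can prioritize articles by topical relevance.
    embeddings = model.encode(abstracts)
    return KMeans(n_clusters=n_topics, n_init=10, random_state=42).fit_predict(embeddings)

In a production setting these two steps would read submissions directly from the existing system's database, as the abstract suggests, rather than from an in-memory list.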
Nature-based Hyperparameter Tuning of a Multilayer Perceptron Algorithm in Task Classification: A Case Study on Fear of Failure in Entrepreneurship
Authors : Saputri, Theresia Ratih Dewi; Kurniawan, Edwin; Lestari, Caecilia Citra; Antonio, Tony
Journal of Applied Data Sciences Vol 6, No 2: MAY 2025
Publisher : Bright Publisher

DOI: 10.47738/jads.v6i2.539

Abstract

Entrepreneurship plays a key role in generating economic growth, encouraging innovation, and creating job opportunities. Understanding which demographic, psychological, and socio-economic factors contribute to fear of failure in entrepreneurship is essential to developing proper standards in entrepreneurship education and policy. However, it remains challenging to accurately classify these factors, especially when balancing model performance against model complexity in a multilayer perceptron algorithm. An effective model requires the correct parameter settings, obtained through a hyperparameter tuning process. Adjusting each hyperparameter by hand requires significant effort and expertise, as there are frequently many combinations to consider. Furthermore, manual tuning is prone to human error and may overlook optimal configurations, resulting in inferior model performance and prediction accuracy. This study evaluates nature-inspired optimization techniques, including particle swarm optimization (PSO), the genetic algorithm (GA), and grey wolf optimization (GWO). Several hyperparameters of the multilayer perceptron model are tuned, including the number of hidden layers, the number of nodes in each hidden layer, the learning rate, and the activation function. The dataset used consists of 39 features from 333 samples capturing individual fear of failure; evaluation considered the loss score and computational efficiency, measured as the amount of time required to find the best parameter combination. Model accuracy scores are 45.16%, 53.76%, and 58.61% for GA, PSO, and GWO, respectively, while the corresponding execution times are 10, 27, and 23 minutes. Experimental results further reveal that each optimization algorithm has distinct advantages: GA excels at rapid convergence, PSO provides robust exploration of the hyperparameter space, and GWO offers remarkable adaptability to complicated parameter interdependencies. This study provides empirical evidence for the efficacy of nature-inspired hyperparameter tuning in improving multilayer perceptron performance for fear-of-failure classification tasks.
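As a rough illustration of how one of the nature-inspired optimizers named above could tune an MLP, the sketch below implements a small genetic algorithm over an assumed search space (hidden layer sizes, learning rate, activation function), using scikit-learn's MLPClassifier with cross-validated accuracy as the fitness function. The search space, population size, generation count, and mutation rate are illustrative choices, not the paper's actual settings.

# Minimal GA sketch (assumed: scikit-learn MLPClassifier, 3-fold CV accuracy as fitness).
import random
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

SEARCH_SPACE = {                       # illustrative search space, not the paper's
    "hidden_layer_sizes": [(32,), (64,), (64, 32), (128, 64)],
    "learning_rate_init": [1e-4, 1e-3, 1e-2],
    "activation": ["relu", "tanh", "logistic"],
}

def fitness(params, X, y):
    # Fitness = mean 3-fold cross-validated accuracy of an MLP with these hyperparameters.
    clf = MLPClassifier(max_iter=500, random_state=42, **params)
    return cross_val_score(clf, X, y, cv=3, scoring="accuracy").mean()

def crossover(a, b):
    # Uniform crossover: each gene (hyperparameter) comes from either parent.
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(params, rate=0.3):
    # With some probability, resample one hyperparameter at random.
    child = dict(params)
    if random.random() < rate:
        key = random.choice(list(SEARCH_SPACE))
        child[key] = random.choice(SEARCH_SPACE[key])
    return child

def genetic_search(X, y, pop_size=8, generations=5):
    population = [{k: random.choice(v) for k, v in SEARCH_SPACE.items()}
                  for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=lambda p: fitness(p, X, y), reverse=True)
        parents = ranked[: pop_size // 2]            # elitist selection: keep the best half
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=lambda p: fitness(p, X, y))

PSO and GWO would replace the selection/crossover/mutation loop with velocity updates or wolf-hierarchy position updates, respectively, while the fitness evaluation stays the same.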