Advancing optimization algorithms with fixed point theory in generalized metric vector spaces Vinsensia, Desi; Utami, Yulia; Awawdeh, Benavides Khan; Bausch, Nocedals Bertesh
International Journal of Basic and Applied Science Vol. 13 No. 3 (2024): Dec: Optimization and Artificial Intelligence
Publisher : Institute of Computer Science (IOCS)

DOI: 10.35335/ijobas.v13i3.621

Abstract

This research develops and evaluates an adaptive parameter-based fixed point iterative algorithm within generalized metric vector spaces to improve stability and convergence speed in optimization problems. The study extends fixed point theory beyond classical metric spaces by incorporating a more flexible structure that accommodates non-Euclidean systems, commonly found in machine learning, data analysis, and dynamic systems optimization. The proposed algorithm modifies the conventional fixed point iteration by introducing an adaptive parameter that adjusts dynamically based on the previous iterations, governed by a control constant. A numerical case study demonstrates the algorithm's effectiveness, comparing it with the classical Banach fixed point iteration. Results show that the adaptive method requires fewer iterations to achieve convergence while maintaining higher stability, significantly outperforming the standard approach. The findings suggest that incorporating adaptive parameters in fixed point iterations enhances computational efficiency, particularly in non-convex optimization and in training deep learning models. Future research will explore the algorithm's robustness in high-dimensional spaces, its integration with hybrid optimization techniques, and applications in uncertain and noisy environments.
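The abstract's formulas are not reproduced in this index entry, so the exact update rule is unknown. As a rough illustration of the idea, the sketch below contrasts the classical Picard (Banach) iteration x_{n+1} = T(x_n) with a Krasnoselskii-Mann-style relaxed iteration x_{n+1} = (1 - a_n) x_n + a_n T(x_n), where the step a_n shrinks by a control constant `beta` whenever the residual stops decreasing. The function names, the adaptation heuristic, and the role of `beta` are all assumptions for illustration, not the paper's actual algorithm.

```python
import math


def banach_fixed_point(T, x0, tol=1e-10, max_iter=1000):
    """Classical Picard iteration: x_{n+1} = T(x_n)."""
    x = x0
    for n in range(1, max_iter + 1):
        Tx = T(x)
        if abs(Tx - x) < tol:
            return x, n
        x = Tx
    return x, max_iter


def adaptive_fixed_point(T, x0, beta=0.5, tol=1e-10, max_iter=1000):
    """Relaxed iteration x_{n+1} = (1 - a) x + a T(x) with an
    adaptive step a (hypothetical rule, not the paper's).

    The step is damped by the control constant `beta` whenever the
    residual |T(x) - x| fails to decrease, which stabilizes maps
    whose plain Picard iteration oscillates or diverges.
    """
    x = x0
    alpha = 1.0          # start with a full Picard step
    prev_res = None
    for n in range(1, max_iter + 1):
        Tx = T(x)
        res = abs(Tx - x)
        if res < tol:
            return x, n
        if prev_res is not None and res >= prev_res:
            alpha = max(beta * alpha, 1e-3)  # damp the step
        prev_res = res
        x = (1 - alpha) * x + alpha * Tx
    return x, max_iter


# Example: both methods find the fixed point of cos (x* ≈ 0.739085).
x_a, n_a = adaptive_fixed_point(math.cos, 1.0)
x_b, n_b = banach_fixed_point(math.cos, 1.0)
```

For a well-behaved contraction such as `math.cos` the two methods behave almost identically; the adaptive damping only engages on maps where the residual stalls, which is the regime the abstract targets.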