Found 3 Documents
Journal : International Journal of Basic and Applied Science

Data-driven corporate growth: A dynamic financial modelling framework for strategic agility
Sihotang, Hengki Tamando; Vinsensia, Desi; Riandari, Fristi; Chandra, Suherman
International Journal of Basic and Applied Science Vol. 13 No. 2 (2024): Sep: Basic and Applied Science
Publisher : Institute of Computer Science (IOCS)

DOI: 10.35335/ijobas.v13i2.485

Abstract

This research aimed to develop a Dynamic Financial Growth Model (DFGM) to enhance corporate growth by promoting strategic agility through data-driven decision-making. The main objective was to optimize corporate value by integrating real-time data, dynamic decision-making, risk management, and scenario analysis. The research employed a mathematical modelling framework that combined predictive analytics, real options theory, and scenario-based optimization to represent dynamic corporate financial decisions. A numerical example demonstrated how the model adjusts strategic decisions in response to changes in market data and evaluates corporate value under optimistic, pessimistic, and baseline scenarios. The main results indicated that the DFGM is effective in optimizing corporate value by allowing for continuous adjustments and strategic flexibility, distinguishing itself from traditional static financial models that lack real-time adaptability. The findings highlighted the value of incorporating risk constraints and scenario analysis, resulting in a balanced approach that manages both growth and uncertainty. However, the study identified limitations, including the need for empirical validation, more sophisticated predictive analytics, and accounting for behavioral factors affecting decision-making. The conclusion emphasizes that the DFGM provides an adaptable, data-driven framework that enhances corporate strategic agility, making it a valuable tool for managing growth in rapidly changing environments, while also suggesting future research to refine the model's practical application.
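The scenario-based valuation the abstract describes can be sketched in a few lines. This is a hypothetical illustration only: the cash flows, growth rates, discount rate, scenario probabilities, and the risk floor below are invented for the example and are not taken from the paper.

```python
# Hypothetical sketch of scenario-based corporate value optimization in the
# spirit of the DFGM; all numbers here are assumed, not from the paper.
scenarios = {
    "optimistic":  {"prob": 0.25, "growth": 0.12},
    "baseline":    {"prob": 0.50, "growth": 0.06},
    "pessimistic": {"prob": 0.25, "growth": -0.02},
}

def corporate_value(cash_flow, growth, discount=0.10, years=5):
    """Discounted value of a cash-flow stream growing at a fixed rate."""
    return sum(cash_flow * (1 + growth) ** t / (1 + discount) ** t
               for t in range(1, years + 1))

# Probability-weighted value across the three scenarios.
expected_value = sum(s["prob"] * corporate_value(100.0, s["growth"])
                     for s in scenarios.values())

# A simple risk constraint: reject strategies whose pessimistic-case value
# falls below a floor, mirroring the paper's balance of growth and risk.
worst_case = corporate_value(100.0, scenarios["pessimistic"]["growth"])
acceptable = worst_case >= 300.0
print(round(expected_value, 2), acceptable)
```

In a dynamic setting the scenario probabilities and growth estimates would be re-fitted as new market data arrives, and the strategy re-optimized each period.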
Fixed Point Theory in Generalized Metric Vector Spaces and their applications in Machine Learning and Optimization Algorithms
Vinsensia, Desi; Utami, Yulia
International Journal of Basic and Applied Science Vol. 13 No. 2 (2024): Sep: Basic and Applied Science
Publisher : Institute of Computer Science (IOCS)

DOI: 10.35335/ijobas.v13i2.504

Abstract

This study introduces a novel formulation of fixed-point theory within generalized metric (G-metric) spaces, with an emphasis on applications in machine learning optimization and high-dimensional data analysis. Building on the concept of complete G-metric spaces, we define a generalized contraction condition tailored for operators representing iterative updates in machine learning algorithms. The proposed framework is exemplified through gradient descent with regularization, demonstrating convergence within a non-Euclidean, high-dimensional setting. Results reveal that our approach not only strengthens convergence properties in iterative algorithms but also complements modern regularization techniques, supporting sparsity and robustness in high-dimensional spaces. These findings underscore the relevance of G-metric spaces and auxiliary functions within fixed-point theory, highlighting their potential to advance adaptive optimization methods. Future work will explore further applications across machine learning paradigms, addressing challenges such as sparse data representation and scalability in complex data environments.
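The connection between regularized gradient descent and fixed points can be made concrete with a minimal sketch. Assumptions (not from the paper): a quadratic loss with an L2 penalty, a step size small enough that the update map is a contraction, and the ordinary Euclidean metric rather than the paper's G-metric.

```python
import numpy as np

# Minimal sketch: the gradient-descent update for the ridge objective
#   f(x) = 0.5 * ||A x - b||^2 + 0.5 * lam * ||x||^2
# is an operator T whose fixed point is the regularized minimizer.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam, eta = 0.1, 0.02          # penalty weight and step size (assumed)

def T(x):
    """One gradient step; a contraction for small enough eta."""
    grad = A.T @ (A @ x - b) + lam * x
    return x - eta * grad

# Iterate T to its fixed point x*, where T(x*) = x*, i.e. the gradient
# vanishes at the regularized minimizer.
x = np.zeros(5)
for _ in range(5000):
    x_new = T(x)
    if np.linalg.norm(x_new - x) < 1e-10:
        x = x_new
        break
    x = x_new

# The fixed point matches the closed-form ridge solution.
x_star = np.linalg.solve(A.T @ A + lam * np.eye(5), A.T @ b)
print(np.allclose(x, x_star, atol=1e-6))
```

The L2 penalty `lam` both regularizes the solution and strengthens the contraction, which is the interplay between regularization and convergence the abstract points to.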
Advancing optimization algorithms with fixed point theory in generalized metric vector spaces
Vinsensia, Desi; Utami, Yulia; Awawdeh, Benavides Khan; Bausch, Nocedals Bertesh
International Journal of Basic and Applied Science Vol. 13 No. 3 (2024): Dec: Optimization and Artificial Intelligence
Publisher : Institute of Computer Science (IOCS)

DOI: 10.35335/ijobas.v13i3.621

Abstract

This research develops and evaluates an adaptive parameter-based fixed point iterative algorithm within generalized metric vector spaces to improve stability and convergence speed in optimization problems. The study extends fixed point theory beyond classical metric spaces by incorporating a more flexible structure that accommodates non-Euclidean systems, commonly found in machine learning, data analysis, and dynamic systems optimization. The proposed algorithm modifies the conventional fixed point iteration by introducing an adaptive parameter that is dynamically adjusted from the previous iterations and bounded by a control constant. A numerical case study demonstrates the algorithm's effectiveness, comparing it with the classical Banach Fixed Point Theorem. Results show that the adaptive method requires fewer iterations to achieve convergence while maintaining higher stability, significantly outperforming the standard approach. The findings suggest that incorporating adaptive parameters in fixed point iterations enhances computational efficiency, particularly in non-convex optimization and deep learning training models. Future research will explore the algorithm's robustness in high-dimensional spaces, its integration with hybrid optimization techniques, and applications in uncertain and noisy environments.
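The adaptive idea can be illustrated with a small sketch. The paper's exact update rule is not reproduced here; the relaxation x_{n+1} = x_n + lam_n (T(x_n) - x_n), the secant-based estimate of lam_n, and the control constant beta that clips it are all assumptions made for this example.

```python
import math

# Hypothetical sketch: compare the classical Banach iteration
# x_{n+1} = T(x_n) with an adaptively relaxed variant, where the
# relaxation parameter lam_n is estimated from the two previous iterates
# and clipped by a control constant beta.
T = math.cos          # a contraction on [0, 1], fixed point ~0.7390851
tol, beta = 1e-10, 10.0

def classical(x, max_iter=1000):
    for n in range(1, max_iter + 1):
        x_new = T(x)
        if abs(x_new - x) < tol:
            return x_new, n
        x = x_new
    return x, max_iter

def adaptive(x, max_iter=1000):
    x_prev, fx_prev = x, T(x)
    x = fx_prev
    for n in range(2, max_iter + 1):
        fx = T(x)
        # Secant estimate of the local contraction factor k ~ T'(x*).
        k = (fx - fx_prev) / (x - x_prev)
        lam = min(max(1.0 / (1.0 - k), -beta), beta)  # clip by beta
        x_new = x + lam * (fx - x)                    # relaxed update
        if abs(x_new - x) < tol:
            return x_new, n
        x_prev, fx_prev = x, fx
        x = x_new
    return x, max_iter

xc, nc = classical(0.0)
xa, na = adaptive(0.0)
print(nc, na)   # the adaptive variant needs far fewer iterations
```

Both runs reach the same fixed point; the adaptive variant converges superlinearly because the secant estimate cancels the dominant contraction factor, while the classical iteration converges only linearly at rate |T'(x*)|.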