Kogundi Math, Manisha
Unknown Affiliation

Published: 1 Document
Articles


Optimizing neural radiance field: a comprehensive review of the impact of different optimizers on neural radiance fields
Pinjarkar, Latika; Nittala, Aditya; P. Mattada, Mahantesh; Pinjarkar, Vedant; Neole, Bhumika; Kogundi Math, Manisha
Bulletin of Electrical Engineering and Informatics Vol 14, No 1: February 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/eei.v14i1.8315

Abstract

Neural radiance field (NeRF) is a deep learning model that represents 3D scenes from a collection of photographs. NeRF has been shown to produce photorealistic renderings of novel views of a scene even from a small number of input images. However, the optimizer employed can have a significant impact on the quality of the final reconstruction, and finding an effective optimizer is one of the biggest challenges when training NeRF models. The optimizer is responsible for updating the model's parameters to minimize the discrepancy between the model's predictions and the actual data. In this study, we survey the optimizers that have been used to train NeRF models, present results contrasting their effectiveness, and examine the benefits and drawbacks of each. NeRF models are trained with four different optimizers: adaptive moment estimation (Adam), AdamW, root mean square propagation (RMSProp), and adaptive gradient (Adagrad). The most effective optimizer for a given task depends on a variety of factors, including the size of the dataset, the complexity of the scene, and the level of accuracy required.
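
The abstract's central point, that the choice of optimizer drives how well the model's parameters are fitted to the training images, can be illustrated with a minimal PyTorch sketch. This is not the authors' code: the tiny MLP, synthetic rays, and learning rates below are placeholder assumptions standing in for a full NeRF pipeline (positional encoding, volume rendering, real image data); only the four optimizer classes correspond to those named in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Placeholder MLP standing in for a NeRF network (a real NeRF maps
# positionally encoded coordinates and view directions to color and density).
def make_model():
    return nn.Sequential(nn.Linear(3, 256), nn.ReLU(), nn.Linear(256, 3))

# The four optimizers compared in the paper; the learning rates here are
# illustrative defaults, not the values used in the study.
optimizer_factories = {
    "Adam":    lambda params: torch.optim.Adam(params, lr=5e-4),
    "AdamW":   lambda params: torch.optim.AdamW(params, lr=5e-4, weight_decay=1e-2),
    "RMSProp": lambda params: torch.optim.RMSprop(params, lr=5e-4),
    "Adagrad": lambda params: torch.optim.Adagrad(params, lr=1e-2),
}

# Toy "rays" and ground-truth pixel colors standing in for a real dataset.
rays = torch.rand(1024, 3)
target_rgb = torch.rand(1024, 3)

for name, make_optimizer in optimizer_factories.items():
    model = make_model()                        # fresh parameters for each optimizer
    optimizer = make_optimizer(model.parameters())
    for step in range(100):
        optimizer.zero_grad()
        pred_rgb = model(rays)                  # predicted colors (no volume rendering here)
        loss = F.mse_loss(pred_rgb, target_rgb) # photometric reconstruction loss
        loss.backward()
        optimizer.step()                        # the optimizer updates the parameters
    print(f"{name}: final loss {loss.item():.4f}")
```

Comparing the final losses (or, in a real setup, PSNR/SSIM on held-out views) across the four runs mirrors the kind of optimizer comparison the paper describes, with each optimizer starting from a freshly initialized model so the runs are independent.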