Term of Award
Summer 2024
Degree Name
Master of Science in Mathematics (M.S.)
Document Type and Release Option
Thesis (open access)
Copyright Statement / License for Reuse
Digital Commons@Georgia Southern License
Department
Department of Mathematical Sciences
Committee Chair
Divine Wanduku
Committee Member 1
Charles Champ
Committee Member 2
Stephen Carden
Committee Member 3
Andrew Sills
Abstract
Classical statistical supervised learning optimization techniques such as the Gauss-Newton Iterative Method (GNIM), the Weighted Gauss-Newton Iterative Method (WGNIM), the Reweighted Gauss-Newton Iterative Method (RGNIM), and the Levenberg-Marquardt (LM) algorithm extend the nonlinear least squares method. The WGNIM improves model fitting by controlling heteroscedasticity in linear and nonlinear models. A comparative analysis of the GNIM, WGNIM, RGNIM, and LM methods for fitting nonlinear models is presented. A step-wise diagnosis of structural multicollinearity in the reweighted linearized model is conducted via the Variance Inflation Factor (VIF) to assess variance inflation in the sequence of estimators of the model parameters. Under restricted multicollinearity levels in simulated experiments, the RGNIM outperforms the GNIM in precision, while the LM algorithm is the most flexible of all the algorithms with respect to the choice of initial parameter estimate. However, the RGNIM and WGNIM require longer computational times.
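To make the family of methods named in the abstract concrete, the sketch below shows a plain Gauss-Newton iteration for nonlinear least squares, with an optional Levenberg-Marquardt-style damping term. It is a minimal illustration, not the thesis's code: the exponential model, the synthetic data, and the starting values are assumptions chosen only for the example.

```python
# Minimal sketch of a Gauss-Newton iteration for nonlinear least squares,
# with optional Levenberg-Marquardt-style damping (lam > 0).
# The model, data, and starting values below are hypothetical.
import numpy as np

def model(beta, x):
    # Hypothetical exponential model y = b0 * exp(b1 * x).
    return beta[0] * np.exp(beta[1] * x)

def jacobian(beta, x):
    # Analytic Jacobian of the model with respect to (b0, b1).
    J = np.empty((x.size, 2))
    J[:, 0] = np.exp(beta[1] * x)
    J[:, 1] = beta[0] * x * np.exp(beta[1] * x)
    return J

def gauss_newton(x, y, beta0, lam=0.0, tol=1e-8, max_iter=100):
    """Plain Gauss-Newton when lam == 0; damped (LM-style) when lam > 0."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        r = y - model(beta, x)                    # residuals
        J = jacobian(beta, x)
        A = J.T @ J + lam * np.eye(beta.size)     # (damped) normal equations
        step = np.linalg.solve(A, J.T @ r)
        beta = beta + step
        if np.linalg.norm(step) < tol:
            break
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(0, 2, 50)
    y = model(np.array([2.0, 1.3]), x) + rng.normal(scale=0.1, size=x.size)
    print(gauss_newton(x, y, beta0=[1.0, 1.0]))            # Gauss-Newton
    print(gauss_newton(x, y, beta0=[1.0, 1.0], lam=1e-2))  # with damping
```

The weighted and reweighted variants discussed in the thesis would modify the normal equations with a weight matrix (e.g., J.T @ W @ J), which is omitted here for brevity.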
OCLC Number
1446519515
Catalog Permalink
https://galileo-georgiasouthern.primo.exlibrisgroup.com/permalink/01GALI_GASOUTH/1r4bu70/alma9916579249602950
Recommended Citation
Debnath, Tanmoy Kumar, "A Comparative Analysis of a Family of Advanced Iterative Optimization Methods in Nonlinear Regression" (2024). Electronic Theses and Dissertations. 2804.
https://digitalcommons.georgiasouthern.edu/etd/2804
Research Data and Supplementary Material
No
Included in
Applied Statistics Commons, Data Science Commons, Numerical Analysis and Computation Commons, Statistical Methodology Commons