A COMPARATIVE ANALYSIS OF GRADIENT DESCENT AND LEAST SQUARES IN LINEAR REGRESSION

Authors

  • Sodikov Ulugbek Beshimovich

Abstract

This paper compares the performance of Gradient Descent (GD) and the Least Squares (LS) method for solving linear regression problems. Theoretical properties, computational efficiency, and accuracy are analyzed. While LS provides a closed-form solution via the normal equations, GD scales better to large datasets. Empirical results illustrate the trade-offs between the two methods in convergence rate, numerical stability, and computational cost.
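
To make the comparison concrete, below is a minimal NumPy sketch, not taken from the paper itself: the synthetic data, learning rate, and iteration count are illustrative assumptions. It fits the same simple linear regression both ways, LS via a one-shot normal-equation solve and GD via iterative updates along the mean-squared-error gradient.

import numpy as np

# Synthetic one-feature dataset (assumed setup, not the paper's data).
rng = np.random.default_rng(0)
n_samples = 100
X = np.column_stack([np.ones(n_samples), rng.uniform(0, 10, n_samples)])  # intercept column + feature
y = 3.0 + 2.0 * X[:, 1] + rng.normal(scale=1.0, size=n_samples)

# Least Squares: closed-form solution of the normal equations,
# beta = (X^T X)^{-1} X^T y, solved via lstsq for numerical stability.
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Gradient Descent: iterative minimization of the mean squared error.
beta_gd = np.zeros(X.shape[1])
eta = 0.01  # learning rate (assumed value; must be tuned to the data scale)
for _ in range(5000):  # iteration budget (assumed value)
    grad = (2.0 / n_samples) * X.T @ (X @ beta_gd - y)  # MSE gradient
    beta_gd -= eta * grad

print("LS coefficients:", beta_ls)
print("GD coefficients:", beta_gd)

In this well-conditioned setup the two coefficient vectors agree closely; the difference in workflow is that the LS solve is one-shot, while GD requires choosing a learning rate and iteration count, which is the tuning cost it trades for scalability.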

Published

2025-08-31