How Does Gradient Descent Actually Work?
https://nickweimer.com/gradient-descent-and-least-squares/
Least Squares sets the derivatives of the cost function equal to zero and solves for the intercept and slope. This finds the minimum of the cost function explicitly, in closed form.
Gradient Descent starts by selecting random values for the slope and intercept, then repeatedly nudges them in the direction of the negative gradient of the cost function until the values converge.
Ordinary Least Squares (OLS) is preferred for small datasets or a relatively small number of features, while Gradient Descent is preferred for large datasets or a large number of features.
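The contrast above can be sketched in a few lines of NumPy: a closed-form OLS fit next to a gradient-descent loop on the same synthetic data. The data, learning rate, and iteration count here are illustrative assumptions, not values from the original post.

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus noise (illustrative, not from the post)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, 100)

# --- Least Squares: solve for intercept and slope in closed form ---
X = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
intercept_ols, slope_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# --- Gradient Descent: random start, then iterative updates ---
slope, intercept = rng.normal(size=2)       # random initial values
lr = 0.01                                   # learning rate (assumed)
for _ in range(5000):
    error = slope * x + intercept - y
    # Partial derivatives of mean squared error w.r.t. each parameter
    grad_slope = 2.0 * np.mean(error * x)
    grad_intercept = 2.0 * np.mean(error)
    slope -= lr * grad_slope
    intercept -= lr * grad_intercept

print("OLS:             ", intercept_ols, slope_ols)
print("Gradient Descent:", intercept, slope)
```

Both routes arrive at essentially the same line; the difference is cost. The closed-form solve factors an n-by-p system once, which gets expensive as the feature count p grows, whereas each gradient-descent step is a cheap pass over the data, which is why it scales better to large problems.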