# Keyword Analysis & Research: gradient descent newton method

## Keyword Research: People who searched gradient descent newton method also searched

Is Newton's method faster than gradient descent?

Where applicable, Newton's method converges much faster towards a local maximum or minimum than gradient descent.
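This speed difference is easy to see numerically. Below is a minimal sketch (the function x − ln(x), minimized at x = 1, is my own illustrative choice, not from the original text) counting how many iterations each method needs to drive the derivative below a tolerance:

```python
def fprime(x):      # derivative of f(x) = x - ln(x), which is minimized at x = 1
    return 1.0 - 1.0 / x

def fsecond(x):     # second derivative of the same f
    return 1.0 / x ** 2

def newton_iters(x, tol=1e-8, max_iter=1000):
    """Count Newton iterations until |f'(x)| < tol."""
    for k in range(max_iter):
        if abs(fprime(x)) < tol:
            return k
        x = x - fprime(x) / fsecond(x)   # Newton step: divide gradient by curvature
    return max_iter

def gd_iters(x, lr=0.1, tol=1e-8, max_iter=100000):
    """Count gradient-descent iterations until |f'(x)| < tol."""
    for k in range(max_iter):
        if abs(fprime(x)) < tol:
            return k
        x = x - lr * fprime(x)           # plain gradient step with fixed learning rate
    return max_iter

n_newton = newton_iters(0.5)   # converges quadratically: a handful of steps
n_gd = gd_iters(0.5)           # converges linearly: far more steps
```

Starting from the same point, Newton's method reaches the tolerance in a few iterations while gradient descent needs well over a hundred, reflecting quadratic versus linear convergence.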

What happens if gradient descent encounters a stationary point during iteration?

If gradient descent encounters a stationary point during iteration, the program continues to run, although the parameters no longer update because the gradient is zero. Newton's method (in its root-finding form), however, requires computing f(x)/f′(x); at a stationary point f′(x) = 0, so the program that runs it would terminate with a division-by-zero error.
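A small sketch of this contrast, using f(x) = x² + 1 (my own illustrative choice) with its stationary point at x = 0:

```python
def f(x):          # f(x) = x**2 + 1 has a stationary point at x = 0
    return x ** 2 + 1

def fprime(x):
    return 2 * x

# Gradient descent at the stationary point: the update is x - lr * 0 = x,
# so the loop keeps running but x never changes.
x = 0.0
for _ in range(5):
    x = x - 0.1 * fprime(x)
gd_stuck_at = x                      # still 0.0

# Newton's root-finding update x - f(x)/f'(x) divides by f'(x),
# which is zero at a stationary point, so the step cannot be computed.
try:
    x_next = 0.0 - f(0.0) / fprime(0.0)
    newton_failed = False
except ZeroDivisionError:
    newton_failed = True             # True: the division by zero is raised
```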

Does gradient descent converge or diverge?

Gradient descent converges only if the step size (learning rate) is sufficiently small (see the first figure below). If the step size is too large, the algorithm can easily diverge out of control (see the second figure below).
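The threshold behavior can be sketched on the simple quadratic f(x) = x² (an illustrative choice, not from the original text), where each gradient step multiplies x by (1 − 2·lr), so the iterates shrink when that factor has magnitude below one and blow up otherwise:

```python
def grad(x):              # gradient of f(x) = x**2
    return 2 * x

def run_gd(lr, x0=1.0, steps=50):
    """Run fixed-step gradient descent and return the final iterate."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)   # each step scales x by (1 - 2*lr)
    return x

x_small = run_gd(lr=0.1)   # |1 - 2*0.1| = 0.8 < 1: iterates converge toward 0
x_large = run_gd(lr=1.1)   # |1 - 2*1.1| = 1.2 > 1: iterates diverge
```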

How do you do gradient descent with a function?

In gradient descent we only use the gradient (first-order information). In other words, we assume that the function ℓ is approximately linear around w, so that ℓ(w + s) ≈ ℓ(w) + g(w)⊤s. Our goal is to find a small step s that minimizes this linear approximation; for a fixed step length, the minimizer points along −g(w), i.e., the direction of steepest descent.
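The idea above can be checked numerically. The sketch below uses an illustrative loss ℓ(w) = w₁² + 3w₂² (my own choice, not from the original text), takes the steepest-descent step s = −η·g(w), and compares the first-order model ℓ(w) + g(w)⊤s against the true loss at w + s:

```python
def loss(w):                      # illustrative loss: ell(w) = w1**2 + 3*w2**2
    return w[0] ** 2 + 3 * w[1] ** 2

def grad(w):                      # its gradient g(w) = (2*w1, 6*w2)
    return (2 * w[0], 6 * w[1])

w = (1.0, 1.0)
g = grad(w)
eta = 0.05
s = (-eta * g[0], -eta * g[1])    # steepest-descent step s = -eta * g(w)

# First-order (linear) model of the loss after the step: ell(w) + g^T s
linear_model = loss(w) + g[0] * s[0] + g[1] * s[1]
actual = loss((w[0] + s[0], w[1] + s[1]))   # true loss at w + s
```

For a small step the linear model tracks the true loss closely, and both are below ℓ(w), which is exactly why moving along −g decreases the objective.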