Gradient descent in Matlab/Octave | by Shaun Enslin - Medium
https://medium.com/geekculture/gradient-descent-in-matlab-octave-954160e2d3fa
Step 1: Load the dataset
Step 2: Normalize
Step 3: Gradient descent
Step 4: Cost function
Step 5: Predict our result

To start off, gradient descent needs three things:

1. Learning rate — we will guess at 0.01
2. Number of repetitions — we will guess at 1500
3. The following thetas (θ), which we will start all at zero:
   - θ0 — the intercept
   - θ1 — theta for the 1st feature
   - θ2 — theta for the 2nd feature
   - θ3 — theta for the 3rd feature

Gradient descent now applies th...
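The steps above can be sketched in Octave as a single script. This is a minimal illustration, not the article's exact code: it assumes `X` is an m-by-n feature matrix (without the intercept column) and `y` is an m-by-1 target vector, and all variable names are illustrative.

```matlab
% Hyperparameters from the article: learning rate and repetitions (guesses)
alpha = 0.01;
iterations = 1500;

% Step 2: normalize each feature to zero mean and unit variance
mu = mean(X);
sigma = std(X);
Xnorm = (X - mu) ./ sigma;

% Prepend a column of ones so theta0 acts as the intercept
m = length(y);
Xb = [ones(m, 1), Xnorm];

% Step 3: gradient descent, with every theta starting at zero
theta = zeros(size(Xb, 2), 1);
J = zeros(iterations, 1);
for iter = 1:iterations
    h = Xb * theta;                                % current predictions
    theta = theta - (alpha / m) * (Xb' * (h - y)); % simultaneous update
    J(iter) = (1 / (2 * m)) * sum((h - y) .^ 2);   % Step 4: cost per iteration
end

% Step 5: predict for a new example xnew (1-by-n), normalized the same way
% prediction = [1, (xnew - mu) ./ sigma] * theta;
```

If `alpha` is well chosen, the recorded cost `J` should decrease on each iteration; plotting `J` is the usual sanity check that the learning rate is neither too large (diverging cost) nor too small (barely moving).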