

Gradient Descent Overview

Objective

Minimize a cost function (e.g., Mean Squared Error) by iteratively updating model parameters (m and c in linear regression) using the gradient (slope) of the cost function.
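For reference, one common formulation of the MSE cost for the linear model and the gradients used in the updates (some texts fold the factor of 2 into the cost by using 1/(2n) instead):

```latex
% MSE cost over n data points (x_i, y_i) for the linear model \hat{y}_i = m x_i + c
J(m, c) = \frac{1}{n} \sum_{i=1}^{n} \bigl( y_i - (m x_i + c) \bigr)^2

% Partial derivatives (the gradient) used in the parameter updates
\frac{\partial J}{\partial m} = -\frac{2}{n} \sum_{i=1}^{n} x_i \bigl( y_i - (m x_i + c) \bigr)

\frac{\partial J}{\partial c} = -\frac{2}{n} \sum_{i=1}^{n} \bigl( y_i - (m x_i + c) \bigr)
```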


Basic Gradient Descent Steps

  1. Initialize m and c randomly.
  2. Repeat for a fixed number of iterations or until convergence:
     - Compute the gradients ∂J/∂m and ∂J/∂c of the cost function at the current m and c.
     - Update the parameters: m ← m − α·(∂J/∂m) and c ← c − α·(∂J/∂c), where α is the learning rate (see the code sketch after the note below).

❗ The final m and c are approximations of the values that minimize the cost, not an exact solution; with a suitable learning rate, though, the algorithm converges to a useful result in relatively few iterations.
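A minimal sketch of these steps in Python. The synthetic data, the specific learning rate, and the iteration count are illustrative assumptions, not part of the original notes:

```python
import numpy as np

def gradient_descent(x, y, alpha=0.01, n_iters=1000):
    """Fit y ≈ m*x + c by minimizing MSE with gradient descent."""
    m, c = 0.0, 0.0                            # step 1: initialize parameters
    n = len(x)
    for _ in range(n_iters):                   # step 2: repeat until done
        error = y - (m * x + c)                # residuals y_i - (m*x_i + c)
        dm = -(2.0 / n) * np.sum(x * error)    # ∂J/∂m
        dc = -(2.0 / n) * np.sum(error)        # ∂J/∂c
        m -= alpha * dm                        # update with learning rate α
        c -= alpha * dc
    return m, c

# Illustrative data: y = 3x + 2 plus a little noise
rng = np.random.default_rng(0)
x = np.linspace(0, 5, 50)
y = 3 * x + 2 + rng.normal(scale=0.5, size=x.shape)

m, c = gradient_descent(x, y, alpha=0.05, n_iters=2000)
print(f"m ≈ {m:.3f}, c ≈ {c:.3f}")  # close to 3 and 2, but not exact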


Learning Rate (α)

The learning rate controls the size of each update step: too small a value makes convergence slow, while too large a value can overshoot the minimum or cause the updates to diverge.
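A quick illustration of this trade-off, continuing from the sketch above (it reuses the gradient_descent function and the synthetic x, y data defined there; the three rates chosen are arbitrary examples):

```python
# Compare fits after 200 iterations at three illustrative learning rates.
for alpha in (0.001, 0.05, 0.2):
    m, c = gradient_descent(x, y, alpha=alpha, n_iters=200)
    mse = np.mean((y - (m * x + c)) ** 2)
    print(f"alpha={alpha:<5} -> m={m:.3g}, c={c:.3g}, MSE={mse:.3g}")

# Typical outcome on this synthetic data: the smallest rate converges slowly
# (c is still far from 2 after 200 iterations), the middle rate settles near
# m ≈ 3, c ≈ 2, and the largest rate overshoots and the parameters blow up.
```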


Types of Gradient Descent

1. Batch Gradient Descent