[ Machine Learning - Andrew Ng ] Univariate Linear Regression | 2-4 Gradient Descent


Function: \(J(\theta_0,\theta_1)\)
Goal: \(\min\limits_{\theta_0,\theta_1} J(\theta_0,\theta_1)\)
Outline:

  • Start from some initial \(\theta_0, \theta_1\)
  • Keep changing \(\theta_0, \theta_1\) to reduce \(J(\theta_0,\theta_1)\), until it reaches a minimum.

Gradient descent algorithm

repeat until convergence {
\(\theta_j := \theta_j - \alpha\frac{\partial}{\partial \theta_j}J(\theta_0,\theta_1)\quad(\text{for } j = 0 \text{ and } j = 1)\)
}
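
As a concrete illustration, here is a minimal Python sketch of this update rule applied to univariate linear regression, assuming the squared-error cost \(J(\theta_0,\theta_1) = \frac{1}{2m}\sum_{i=1}^{m}(\theta_0 + \theta_1 x^{(i)} - y^{(i)})^2\) from the earlier sections; the learning rate and the fixed iteration count (standing in for a convergence check) are illustrative choices, not values given in the lecture.

```python
# A minimal sketch of batch gradient descent for univariate linear regression,
# assuming the squared-error cost J(theta0, theta1) = 1/(2m) * sum((theta0 + theta1*x - y)^2).
# alpha and num_iters are illustrative, not prescribed by the lecture.

def gradient_descent(x, y, alpha=0.01, num_iters=1000):
    m = len(x)
    theta0, theta1 = 0.0, 0.0               # start from some initial theta0, theta1
    for _ in range(num_iters):               # "repeat until convergence" (fixed count here)
        # Hypothesis h(x) = theta0 + theta1 * x, evaluated on every training example
        errors = [theta0 + theta1 * xi - yi for xi, yi in zip(x, y)]
        # Partial derivatives of J with respect to theta0 and theta1
        grad0 = sum(errors) / m
        grad1 = sum(e * xi for e, xi in zip(errors, x)) / m
        # Simultaneous update: both gradients were computed from the *old* theta0, theta1
        theta0, theta1 = theta0 - alpha * grad0, theta1 - alpha * grad1
    return theta0, theta1
```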

Correct: simultaneous update

temp0 \(:= \theta_0 - \alpha\frac{\partial}{\partial \theta_0}J(\theta_0,\theta_1)\)
temp1 \(:= \theta_1 - \alpha\frac{\partial}{\partial \theta_1}J(\theta_0,\theta_1)\)
\(\theta_0 :=\) temp0
\(\theta_1 :=\) temp1

Incorrect: sequential (non-simultaneous) update

temp0 \(:= \theta_0 - \alpha\frac{\partial}{\partial \theta_0}J(\theta_0,\theta_1)\)
\(\theta_0 :=\) temp0
temp1 \(:= \theta_1 - \alpha\frac{\partial}{\partial \theta_1}J(\theta_0,\theta_1)\)
\(\theta_1 :=\) temp1
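
To make the difference concrete, the sketch below writes one update step both ways; `dJ_dtheta0` and `dJ_dtheta1` are hypothetical helpers standing in for \(\frac{\partial}{\partial \theta_0}J\) and \(\frac{\partial}{\partial \theta_1}J\). In the sequential version, \(\theta_1\) is updated using the already-modified \(\theta_0\), so the step taken is no longer the gradient descent step defined above.

```python
# One update step written both ways, to show why the order matters.
# dJ_dtheta0(theta0, theta1) and dJ_dtheta1(theta0, theta1) are hypothetical
# callables that return the partial derivatives of J at the given point.

def step_simultaneous(theta0, theta1, alpha, dJ_dtheta0, dJ_dtheta1):
    # Correct: both temporaries are computed from the same (theta0, theta1)
    temp0 = theta0 - alpha * dJ_dtheta0(theta0, theta1)
    temp1 = theta1 - alpha * dJ_dtheta1(theta0, theta1)
    return temp0, temp1

def step_sequential(theta0, theta1, alpha, dJ_dtheta0, dJ_dtheta1):
    # Incorrect: theta1's gradient is evaluated at the *already updated* theta0,
    # which generally yields a different result than the simultaneous update
    theta0 = theta0 - alpha * dJ_dtheta0(theta0, theta1)
    theta1 = theta1 - alpha * dJ_dtheta1(theta0, theta1)
    return theta0, theta1
```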
