# Neural Network and Deep Learning
## Logistic Regression
$$\begin{align}
&\text{Forward pass:} \\
z &= w^Tx + b \\
a &= \sigma(z) = \frac{1}{1+e^{-z}} \\
L(\hat{y}, y) &= -y\log(\hat{y}) - (1-y)\log(1-\hat{y}) \ \ \text{where } \hat{y} = a \\
&\text{Backward pass:} \\
\frac{dL}{da} &= \frac{a-y}{a(1-a)} \\
\frac{da}{dz} &= a(1-a) \\
dz = \frac{dL}{dz} &= \frac{dL}{da} \cdot \frac{da}{dz} = a-y \\
dw = \frac{dL}{dw} &= \frac{dL}{dz} \cdot \frac{dz}{dw} = x\,dz \\
db = \frac{dL}{db} &= \frac{dL}{dz} \cdot \frac{dz}{db} = dz \\
w &= w - \eta \cdot dw \\
b &= b - \eta \cdot db
\end{align}$$

Forward pass: compute the network output.

Backward pass: update the model parameters.
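
A minimal NumPy sketch of these update rules for a single training example, assuming a learning rate `eta`; the function name `train_step` is hypothetical, and the variables `dz`, `dw`, `db` mirror the derivation above.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation: sigma(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def train_step(w, b, x, y, eta=0.1):
    """One forward/backward pass for a single example (x, y).

    w: weight vector, shape (n,); b: scalar bias
    x: feature vector, shape (n,); y: label, 0 or 1
    """
    # Forward pass: compute the network output.
    z = np.dot(w, x) + b                              # z = w^T x + b
    a = sigmoid(z)                                    # a = sigma(z), i.e. y_hat
    loss = -y * np.log(a) - (1 - y) * np.log(1 - a)   # cross-entropy loss

    # Backward pass: compute gradients and update the parameters.
    dz = a - y          # dL/dz
    dw = x * dz         # dL/dw
    db = dz             # dL/db
    w = w - eta * dw
    b = b - eta * db
    return w, b, loss

# Usage: a few gradient-descent steps on one toy example.
rng = np.random.default_rng(0)
w, b = rng.normal(size=3), 0.0
x, y = np.array([0.5, -1.2, 3.0]), 1
for _ in range(100):
    w, b, loss = train_step(w, b, x, y)
print(loss)  # the loss shrinks toward 0 as the example is fitted
```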