Compare commits


2 Commits

Author SHA1 Message Date
072605d1ff vault backup: 2023-10-17 15:44:10 2023-10-17 15:44:11 +08:00
1ac50a68e8 vault backup: 2023-10-17 15:29:22 2023-10-17 15:29:22 +08:00


@@ -1 +1,18 @@
# Neural Network and Deep Learning
## Logistic Regression
$$\begin{align}
&\text{Forward pass}\\
z &= w^Tx + b \\
a &= \sigma(z) = \frac{1}{1+e^{-z}} \\
L(a, y) &= -y\log a - (1-y)\log(1-a) \quad \text{where } \hat{y} = a \\
&\text{Backward pass} \\
\frac{dL}{da} &= \frac{a-y}{a(1-a)} \\
\frac{da}{dz} &= a(1-a) \\
dz = \frac{dL}{dz} &= \frac{dL}{da} \cdot \frac{da}{dz} = a-y \\
dw = \frac{dL}{dw} &= \frac{dL}{dz} \cdot \frac{dz}{dw} = x\,dz \\
db = \frac{dL}{db} &= \frac{dL}{dz} \cdot \frac{dz}{db} = dz \\
w &= w - \eta \cdot dw \\
b &= b - \eta \cdot db
\end{align}$$
Forward pass: compute the network output.
Backward pass: propagate the loss gradient back through the chain rule and update the model parameters.
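The derivation above can be sketched as a single-sample gradient step in NumPy. This is a minimal illustration, not the note's own code; the function name `logistic_step` and the toy inputs are my own.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation sigma(z) = 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_step(w, b, x, y, eta):
    """One forward/backward pass of logistic regression on a single sample.

    w, x : weight and input vectors; b : bias; y : label in {0, 1};
    eta  : learning rate. Returns updated (w, b) and the loss.
    """
    # Forward pass: compute the network output a = sigma(w^T x + b).
    z = np.dot(w, x) + b
    a = sigmoid(z)
    # Cross-entropy loss L(a, y) = -y log a - (1 - y) log(1 - a).
    loss = -y * np.log(a) - (1 - y) * np.log(1 - a)
    # Backward pass: the chain rule collapses dL/da * da/dz to dz = a - y.
    dz = a - y
    dw = x * dz
    db = dz
    # Gradient-descent update of the model parameters.
    w = w - eta * dw
    b = b - eta * db
    return w, b, loss

# Toy usage: one step from zero-initialized parameters.
w, b, loss = logistic_step(np.zeros(2), 0.0, np.array([1.0, 2.0]), 1.0, 0.1)
```

With zero-initialized parameters, `a = sigmoid(0) = 0.5`, so the first loss is `log 2` and `dz = -0.5`, which moves `w` and `b` toward classifying the positive sample correctly.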