Advanced Recurrent Neural Networks
1. GRU
2. LSTM
3. Deep RNN
4. Bidirectional RNN
1. GRU
A problem with vanilla RNNs: gradients tend to vanish or explode during backpropagation through time (BPTT).
Gated recurrent neural networks: designed to capture dependencies across long time-step distances in a time series.
1.1 Mathematical formulation
\[
\begin{aligned}
R_t &= \sigma(X_t W_{xr} + H_{t-1} W_{hr} + b_r)\\
Z_t &= \sigma(X_t W_{xz} + H_{t-1} W_{hz} + b_z)\\
\widetilde{H}_t &= \tanh(X_t W_{xh} + (R_t \odot H_{t-1}) W_{hh} + b_h)\\
H_t &= Z_t \odot H_{t-1} + (1 - Z_t) \odot \widetilde{H}_t
\end{aligned}
\]
1.2 Structure
- Reset gate \(R_t\): helps capture short-term dependencies in a time series;
- Update gate \(Z_t\): helps capture long-term dependencies in a time series.
1.3 Implementation
- Official implementation: https://pytorch.org/docs/1.3.0/nn.html#gru
- From-scratch implementation:
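As a from-scratch sketch (not from the original post), the four GRU equations above translate almost line for line into NumPy; the function name `gru_step` and the parameter-dictionary layout are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, params):
    """One GRU time step, following the equations in section 1.1.

    x_t:    input at step t, shape (batch, n_in)
    h_prev: hidden state H_{t-1}, shape (batch, n_h)
    params: dict with W_xr, W_hr, b_r, W_xz, W_hz, b_z, W_xh, W_hh, b_h
    """
    # Reset gate R_t and update gate Z_t
    r = sigmoid(x_t @ params["W_xr"] + h_prev @ params["W_hr"] + params["b_r"])
    z = sigmoid(x_t @ params["W_xz"] + h_prev @ params["W_hz"] + params["b_z"])
    # Candidate hidden state: the reset gate scales how much of H_{t-1} enters
    h_tilde = np.tanh(x_t @ params["W_xh"] + (r * h_prev) @ params["W_hh"] + params["b_h"])
    # Update gate interpolates between the old state and the candidate
    return z * h_prev + (1 - z) * h_tilde
```

Note that `*` is the elementwise product \(\odot\) while `@` is the matrix product, matching the equations.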
2. LSTM
2.1 Mathematical formulation
\[
\begin{split}\begin{aligned} \boldsymbol{I}_t &= \sigma(\boldsymbol{X}_t \boldsymbol{W}_{xi} + \boldsymbol{H}_{t-1} \boldsymbol{W}_{hi} + \boldsymbol{b}_i),\\
\boldsymbol{F}_t &= \sigma(\boldsymbol{X}_t \boldsymbol{W}_{xf} + \boldsymbol{H}_{t-1} \boldsymbol{W}_{hf} + \boldsymbol{b}_f),\\
\boldsymbol{O}_t &= \sigma(\boldsymbol{X}_t \boldsymbol{W}_{xo} + \boldsymbol{H}_{t-1} \boldsymbol{W}_{ho} + \boldsymbol{b}_o), \end{aligned}\end{split}
\]
\[
\begin{split}\begin{aligned}
\tilde{\boldsymbol{C}}_t &= \text{tanh}(\boldsymbol{X}_t \boldsymbol{W}_{xc} + \boldsymbol{H}_{t-1} \boldsymbol{W}_{hc} + \boldsymbol{b}_c),\\
\boldsymbol{C}_t &= \boldsymbol{F}_t \odot \boldsymbol{C}_{t-1} + \boldsymbol{I}_t \odot \tilde{\boldsymbol{C}}_t,\\
\boldsymbol{H}_t &= \boldsymbol{O}_t \odot \text{tanh}(\boldsymbol{C}_t).
\end{aligned}\end{split}
\]
2.2 Structure
- Forget gate (\(\boldsymbol{F}_t\)): controls how much of the previous time step's memory cell is retained
- Input gate (\(\boldsymbol{I}_t\)): controls how much of the current time step's input enters the memory cell
- Output gate (\(\boldsymbol{O}_t\)): controls the flow of information from the memory cell to the hidden state
- Memory cell (candidate memory cell \(\tilde{\boldsymbol{C}}_t\), memory cell \(\boldsymbol{C}_t\)): a special kind of hidden state that carries information across time steps
2.3 Implementation
- Official implementation: https://pytorch.org/docs/1.3.0/nn.html#lstm
- From-scratch implementation:
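As with the GRU, here is a from-scratch NumPy sketch (not from the original post) of one LSTM step following the six equations in section 2.1; `lstm_step` and the parameter names are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, state, params):
    """One LSTM time step, following the equations in section 2.1.

    x_t:    input at step t, shape (batch, n_in)
    state:  tuple (H_{t-1}, C_{t-1}), each of shape (batch, n_h)
    params: dict with W_x*, W_h*, b_* for gates i, f, o and candidate c
    """
    h_prev, c_prev = state
    # Input, forget, and output gates
    i = sigmoid(x_t @ params["W_xi"] + h_prev @ params["W_hi"] + params["b_i"])
    f = sigmoid(x_t @ params["W_xf"] + h_prev @ params["W_hf"] + params["b_f"])
    o = sigmoid(x_t @ params["W_xo"] + h_prev @ params["W_ho"] + params["b_o"])
    # Candidate memory cell
    c_tilde = np.tanh(x_t @ params["W_xc"] + h_prev @ params["W_hc"] + params["b_c"])
    # Forget gate keeps part of the old cell; input gate admits the candidate
    c = f * c_prev + i * c_tilde
    # Output gate controls what flows from the cell into the hidden state
    h = o * np.tanh(c)
    return h, c
```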
3. Deep RNN
3.1 Mathematical formulation
\[
\begin{aligned}
\boldsymbol{H}_t^{(1)} &= \phi(\boldsymbol{X}_t \boldsymbol{W}_{xh}^{(1)} + \boldsymbol{H}_{t-1}^{(1)} \boldsymbol{W}_{hh}^{(1)} + \boldsymbol{b}_h^{(1)})\\
\boldsymbol{H}_t^{(\ell)} &= \phi(\boldsymbol{H}_t^{(\ell-1)} \boldsymbol{W}_{xh}^{(\ell)} + \boldsymbol{H}_{t-1}^{(\ell)} \boldsymbol{W}_{hh}^{(\ell)} + \boldsymbol{b}_h^{(\ell)})\\
\boldsymbol{O}_t &= \boldsymbol{H}_t^{(L)} \boldsymbol{W}_{hq} + \boldsymbol{b}_q
\end{aligned}
\]
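The layer recursion above can be sketched in NumPy (an illustration, not from the original post; `deep_rnn_step` and the weight layout are assumptions), using \(\phi = \tanh\):

```python
import numpy as np

def deep_rnn_step(x_t, h_prev_layers, weights):
    """One time step of an L-layer RNN with tanh activation.

    x_t:           input at step t, shape (batch, n_in)
    h_prev_layers: list of per-layer hidden states from step t-1
    weights:       list of (W_xh, W_hh, b_h) tuples, one per layer
    """
    h_new = []
    inp = x_t
    for (W_xh, W_hh, b_h), h_prev in zip(weights, h_prev_layers):
        # Layer l combines the layer below's output at t with its own state at t-1
        h = np.tanh(inp @ W_xh + h_prev @ W_hh + b_h)
        h_new.append(h)
        inp = h  # layer l's hidden state becomes layer l+1's input
    return h_new
```

The output \(\boldsymbol{O}_t\) would then be an affine map of the top layer's state, `h_new[-1] @ W_hq + b_q`.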
3.2 Structure
4. Bidirectional RNN
4.1 Mathematical formulation
\[
\begin{aligned}
\overrightarrow{\boldsymbol{H}}_t &= \phi(\boldsymbol{X}_t \boldsymbol{W}_{xh}^{(f)} + \overrightarrow{\boldsymbol{H}}_{t-1} \boldsymbol{W}_{hh}^{(f)} + \boldsymbol{b}_h^{(f)})\\
\overleftarrow{\boldsymbol{H}}_t &= \phi(\boldsymbol{X}_t \boldsymbol{W}_{xh}^{(b)} + \overleftarrow{\boldsymbol{H}}_{t+1} \boldsymbol{W}_{hh}^{(b)} + \boldsymbol{b}_h^{(b)})
\end{aligned}
\]
\[
\boldsymbol{H}_t=(\overrightarrow{\boldsymbol{H}}_{t}, \overleftarrow{\boldsymbol{H}}_t)
\]
\[
\boldsymbol{O}_t = \boldsymbol{H}_t \boldsymbol{W}_{hq} + \boldsymbol{b}_q
\]
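The two passes and the concatenation \(\boldsymbol{H}_t = (\overrightarrow{\boldsymbol{H}}_t, \overleftarrow{\boldsymbol{H}}_t)\) can be sketched as follows (an illustrative NumPy version, not from the original post; `bidirectional_rnn` and its step-function interface are assumptions):

```python
import numpy as np

def bidirectional_rnn(X, fwd_step, bwd_step, h0_f, h0_b):
    """Run one forward and one backward pass over a sequence, then
    concatenate the two hidden states at each time step.

    X:        sequence, shape (T, batch, n_in)
    fwd_step: function (x_t, h) -> h computing the forward recursion
    bwd_step: function (x_t, h) -> h computing the backward recursion
    """
    T = X.shape[0]
    fwd, bwd = [], [None] * T
    h_f, h_b = h0_f, h0_b
    for t in range(T):              # forward direction: t = 0 .. T-1
        h_f = fwd_step(X[t], h_f)
        fwd.append(h_f)
    for t in reversed(range(T)):    # backward direction: t = T-1 .. 0
        h_b = bwd_step(X[t], h_b)
        bwd[t] = h_b
    # H_t is the concatenation of the two states along the feature axis
    return [np.concatenate([f, b], axis=-1) for f, b in zip(fwd, bwd)]
```

Because the backward state at step \(t\) depends on \(\overleftarrow{\boldsymbol{H}}_{t+1}\), the whole sequence must be available before the backward pass runs, which is why bidirectional RNNs suit offline tasks such as tagging rather than streaming prediction.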
4.2 Structure
Original article: https://www.cnblogs.com/54hys/p/12311202.html