# mnn.layer.ReluLayer

## class ReluLayer

### method __init__

`__init__()`

### method backward

`backward(gradients)`

$$ \begin{aligned} \nabla_x \ell &= \nabla_f \ell \cdot \begin{bmatrix} f_1'(x_1) & 0 & \cdots & 0 \\ 0 & f_2'(x_2) & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & f_n'(x_n) \end{bmatrix} \\ &= \nabla_f \ell \odot \begin{bmatrix} f_1'(x_1) & f_2'(x_2) & \cdots & f_n'(x_n) \end{bmatrix}^T \\ &= \nabla_f \ell \odot \nabla_x f \end{aligned} $$
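Because ReLU acts element-wise, its Jacobian is diagonal, so the matrix product above collapses to a Hadamard product. A minimal NumPy sketch of that rule (the free function `relu_backward` and the explicit `inputs` argument are illustrative assumptions; the actual method instead reads the inputs cached by `forward`):

```python
import numpy as np

def relu_backward(inputs, gradients):
    # f'(x) is 1 where the forward input was >= 0, and 0 elsewhere.
    relu_grad = (inputs >= 0).astype(gradients.dtype)
    # grad_x = grad_f ⊙ f'(x): the Hadamard product from the derivation above.
    return gradients * relu_grad
```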


### method forward

`forward(inputs, feedbacks=None)`

ReLU activation function, applied element-wise:

$$ f_i(x_i) = \begin{cases} x_i & (x_i \ge 0) \\ 0 & (\text{otherwise}) \end{cases} $$
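This piecewise definition is an element-wise `max(x, 0)`. A one-line NumPy sketch (not mnn's actual code; `feedbacks` is accepted but ignored here, since ReLU does not use it):

```python
import numpy as np

def relu_forward(inputs, feedbacks=None):
    # Element-wise max(x, 0) implements the piecewise definition above.
    return np.maximum(inputs, 0)

print(relu_forward(np.array([-1.5, 0.0, 2.0])))  # [0. 0. 2.]
```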


### method step

`step(lr=0.01)`
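ReLU has no learnable parameters, so `step` presumably exists only to satisfy the shared layer interface and leaves the layer unchanged. A sketch under that assumption (the class name is illustrative, not mnn's actual implementation):

```python
class ReluLayerSketch:
    # Illustrative stand-in for mnn.layer.ReluLayer.
    def step(self, lr=0.01):
        # ReLU applies a fixed element-wise function: no weights to update.
        pass
```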