Loss Function
We use binary log loss (cross entropy).
$$ Loss = -\frac{1}{N} \sum_{i=1}^N \left( y_i\log(p_i) + (1 - y_i)\log(1 - p_i) \right) $$
Remember: the log here is the natural log ($\ln$), because it is the inverse of the exponential $e$, so the two cancel cleanly when we take derivatives.
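To make the formula concrete, here is a minimal sketch in plain Python; the function name `binary_log_loss` and the sample values are illustrative assumptions, not from the original post:

```python
import math

def binary_log_loss(y_true, y_pred):
    """Binary log loss (cross entropy), averaged over N samples, natural log."""
    n = len(y_true)
    total = 0.0
    for y, p in zip(y_true, y_pred):
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / n

# Example: two confident correct predictions and one wrong one.
print(binary_log_loss([1, 0, 1], [0.9, 0.1, 0.2]))  # ~0.607
```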
Forward
In the forward pass, we use the neural network to compute the predicted value and the loss value (a short sketch follows the definitions below):
Predicted Value: calculated from the inputs and the weights; this is the network's output.
Loss Value: obtained by feeding the predicted value and the actual value into the loss function, giving the loss for the current prediction.
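A minimal sketch of one forward pass for a single-input model; the sigmoid activation and the toy values for `a`, `b`, `x1`, and `y` are assumptions chosen to match the `y = a * x1 + b` example later in the post:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Assumed toy values: weight a, bias b, input x1, actual label y.
a, b = 0.5, -0.2
x1, y = 2.0, 1.0

# Predicted value: combine input and weights, then squash to (0, 1).
p = sigmoid(a * x1 + b)

# Loss value: plug the prediction and the actual label into the loss function.
loss = -(y * math.log(p) + (1 - y) * math.log(1 - p))
print(p, loss)  # p ~0.69, loss ~0.371
```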
Backward
In the backward pass, we start from the loss value and move backward through the network, applying the derivative of each gate (the chain rule) to update every weight.
For an output $y = f(g(x))$ built from two gates, the chain rule multiplies the local derivative of each gate (backpropagation seeds the chain with $\frac{\partial y}{\partial y} = 1$ at the output):
$$ \frac{\partial y}{\partial x} = \frac{\partial y}{\partial f} \cdot \frac{\partial f}{\partial g} \cdot \frac{\partial g}{\partial x} $$
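Continuing the same toy model, here is a sketch of one backward step; it uses the standard fact that for binary log loss on a sigmoid output, the combined derivative $\frac{\partial L}{\partial z}$ simplifies to $(p - y)$:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Same assumed toy values as in the forward sketch above.
a, b = 0.5, -0.2
x1, y = 2.0, 1.0
lr = 0.1  # assumed learning rate

# Forward: z = a * x1 + b, then p = sigmoid(z).
z = a * x1 + b
p = sigmoid(z)

# Backward, gate by gate:
# dL/dz = dL/dp * dp/dz, which for log loss + sigmoid reduces to (p - y).
dz = p - y
da = dz * x1   # dz/da = x1
db = dz * 1.0  # dz/db = 1

# Move each weight against its gradient.
a -= lr * da
b -= lr * db
print(a, b)  # both increase, pushing p toward the label y = 1
```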
Code Example
In the code, the input values go into Variable, while the weight values and the learning rate go into Parameter.
```python
# y = a * x1 + b
```
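Only the first line of the original listing survives, so the following is a hedged reconstruction; the `Variable` and `Parameter` classes here are hypothetical stand-ins for the ones the text mentions, not the author's actual implementation:

```python
import math

class Variable:
    """Holds an input value (hypothetical stand-in)."""
    def __init__(self, value):
        self.value = value

class Parameter:
    """Holds a trainable value and its gradient (hypothetical stand-in)."""
    def __init__(self, value):
        self.value = value
        self.grad = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# y = a * x1 + b, passed through a sigmoid for a binary prediction.
a, b = Parameter(0.5), Parameter(-0.2)
lr = Parameter(0.1)                    # learning rate in a Parameter, per the text
x1, y = Variable(2.0), Variable(1.0)   # inputs in Variable

for step in range(100):
    # Forward: predicted value, then loss value.
    p = sigmoid(a.value * x1.value + b.value)
    loss = -(y.value * math.log(p) + (1 - y.value) * math.log(1 - p))

    # Backward: chain rule, using the (p - y) shortcut for log loss + sigmoid.
    dz = p - y.value
    a.grad = dz * x1.value
    b.grad = dz

    # Update each weight against its gradient.
    a.value -= lr.value * a.grad
    b.value -= lr.value * b.grad

print(loss)  # the loss shrinks toward 0 as p approaches y
```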