For the neural network below, we will use the backpropagation algorithm to update the weights. The bias (b) is always 0 (ignore the bias when solving the problem), the dataset contains a single record with $x=2$ and target $y=5$ (see the table below), and the activation function is the identity, $f(z) = z$.
| feature (x) | Target (y) |
|-------------|------------|
| 2           | 5          |
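
With the identity activation and zero bias, the forward pass for this single record collapses to one multiplication, which is all that is needed in the parts below:

$$\hat{y} = f(wx + b) = wx = 2w$$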
1) Define the cost function, $J(w)$, based on the error used in the backpropagation algorithm, $J(w) = E = \frac{1}{2}(predicted - target)^2$, and sketch its graph (a worked expansion is given after this list)
2) Initialize the weight to $w=3$ and calculate the error (a quick check follows this list)
3) Calculate the updated weights using the gradient descent algorithm after three updates for each of the following values of the learning rate ($\alpha$); a runnable check is sketched after the hint below
- $\alpha$ = 1
- $\alpha$ = 0.1
- $\alpha$ = 0.5
Hint: $w_{new} = w_{old} - \alpha \frac{\partial E}{\partial w}$
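
For part 1, substituting the forward pass $\hat{y} = 2w$ into the error gives the cost as a function of the single weight:

$$J(w) = \frac{1}{2}(2w - 5)^2$$

This is a parabola with its minimum $J(w^*) = 0$ at $w^* = 2.5$, which is the curve to draw.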
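
For part 2, with $w = 3$ the prediction is $\hat{y} = 2 \cdot 3 = 6$, so the initial error is

$$E = \frac{1}{2}(6 - 5)^2 = 0.5$$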
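
For part 3, the chain rule gives $\frac{\partial E}{\partial w} = (\hat{y} - y)\,x = (2w - 5) \cdot 2$. Below is a minimal Python sketch of the update rule (the function and variable names are illustrative, not part of the original problem) that can be used to check the hand calculations:

```python
# Gradient descent on J(w) = 0.5 * (w*x - y)**2 for the single sample (x=2, y=5),
# with identity activation and zero bias, so the prediction is simply w*x.

x, y = 2.0, 5.0

def gradient(w):
    """dJ/dw = (w*x - y) * x for this one-sample dataset."""
    return (w * x - y) * x

def run_updates(alpha, w=3.0, steps=3):
    """Apply w <- w - alpha * dJ/dw the requested number of times."""
    history = [w]
    for _ in range(steps):
        w = w - alpha * gradient(w)
        history.append(w)
    return history

for alpha in (1.0, 0.1, 0.5):
    print(f"alpha={alpha}: {run_updates(alpha)}")
```

Running it reproduces the three regimes the exercise is probing: $\alpha = 0.1$ moves steadily toward the optimum $w^* = 2.5$, $\alpha = 0.5$ oscillates between $w = 3$ and $w = 2$, and $\alpha = 1$ overshoots and diverges.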