asked in Machine Learning by (116k points)  

Assume we have the following neural network, in which all activation functions are $f(z)=z$. If the weights are initialized with the values shown in the table below, what will the updated weights be after one step with learning rate $\alpha = 0.05$?

Assume the input values are $[i_1, i_2] = [2, 3]$ and the target value is $out = 1$.

Hint:
$w_{new} = w_{old} - \alpha \frac{\partial E}{\partial w}$

$E_{\text {total}}=\sum \frac{1}{2}(\text {target}-\text {output})^{2}$
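
Expanding the hint with the chain rule (all activations are the identity, so $f'(z)=1$), each weight's gradient is the output error times the sensitivity of the output to that weight:

$\frac{\partial E}{\partial w}=\frac{\partial E}{\partial \text{output}} \cdot \frac{\partial \text{output}}{\partial w}=(\text{output}-\text{target}) \cdot \frac{\partial \text{output}}{\partial w}$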

Updating weights in the backpropagation algorithm:

| Weight | Initialization | New weight after one step |
| --- | --- | --- |
| $w_1$ | 0.11 | ? |
| $w_2$ | 0.21 | ? |
| $w_3$ | 0.12 | ? |
| $w_4$ | 0.08 | ? |
| $w_5$ | 0.14 | ? |
| $w_6$ | 0.15 | ? |

commented by (100 points)  
w6 = 0.15, not 0.1, with respect to the posted solution.
commented by (160 points)  
i1, i2, and the target value should be given! Otherwise it is not possible to update w1, w2, w3, and w4.
commented by (116k points)  
Thanks for your note, Yoga. It has been added.

1 Answer

answered by (116k points)

Best answer

A detailed solution is provided at this URL.

The updated weights are as follows:
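
Since the linked solution is not reproduced here, below is a minimal sketch that recomputes the one-step update. It assumes the usual 2-2-1 wiring for this problem ($h_1 = w_1 i_1 + w_2 i_2$, $h_2 = w_3 i_1 + w_4 i_2$, $out = w_5 h_1 + w_6 h_2$); the diagram is not shown above, so that wiring is an assumption.

```python
# One gradient step for a 2-2-1 network with identity activations f(z) = z.
# Assumed wiring (the original diagram is not reproduced here):
#   h1  = w1*i1 + w2*i2
#   h2  = w3*i1 + w4*i2
#   out = w5*h1 + w6*h2
# Loss: E = 0.5 * (target - out)**2, update: w_new = w_old - alpha * dE/dw

i1, i2 = 2.0, 3.0
target = 1.0
alpha = 0.05
w = {"w1": 0.11, "w2": 0.21, "w3": 0.12, "w4": 0.08, "w5": 0.14, "w6": 0.15}

# Forward pass
h1 = w["w1"] * i1 + w["w2"] * i2      # 0.85
h2 = w["w3"] * i1 + w["w4"] * i2      # 0.48
out = w["w5"] * h1 + w["w6"] * h2     # 0.191

# dE/dout = out - target (the same under either sign convention for E)
delta = out - target                  # -0.809

# Chain rule: dE/dw = delta * dout/dw for each weight
grads = {
    "w1": delta * w["w5"] * i1,
    "w2": delta * w["w5"] * i2,
    "w3": delta * w["w6"] * i1,
    "w4": delta * w["w6"] * i2,
    "w5": delta * h1,
    "w6": delta * h2,
}

for name in w:
    print(f"{name}: {w[name]:.2f} -> {w[name] - alpha * grads[name]:.4f}")
```

Under those assumptions this prints roughly w1 -> 0.1213, w2 -> 0.2270, w3 -> 0.1321, w4 -> 0.0982, w5 -> 0.1744, w6 -> 0.1694.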

commented by (280 points)  
In this solution they solve using Error = 1/2 (output - target)^2; however, the hint in the question above gives Error = 1/2 (target - output)^2.

I don't think it makes a difference to our solution, though: the delta is squared when calculating the error, so the sign goes away, and in the derivative the minus signs cancel out in the chain rule.
commented by (100 points)  
It does make a difference, because after taking the derivative it should be (target - output), but in the solution it is (output - target).
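
For reference, both error forms give exactly the same derivative with respect to the output, because of the inner $-1$ from the chain rule:

$\frac{\partial}{\partial out} \frac{1}{2}(\text{target}-out)^{2}=-(\text{target}-out)=out-\text{target}=\frac{\partial}{\partial out} \frac{1}{2}(out-\text{target})^{2}$

so the resulting weight updates match either way.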
...