+2 votes
4.2k views
asked in Machine Learning by (116k points)  

For the neural network below, suppose we use the backpropagation algorithm to update the weights. The bias is fixed at $b = 0$ (ignore the bias when solving the problem), the dataset contains a single record with $x = 2$ and target $y = 5$, as shown in the following table, and the activation function is the identity, $f(z) = z$.

| Feature (x) | Target (y) |
| ----------- | ---------- |
| 2           | 5          |

 

1) Define the cost function $J(w)$ based on the error in the backpropagation algorithm, $J(w) = E = \frac{1}{2}(\text{predicted} - \text{target})^2$, and sketch its graph

2) Initialize the weight to $w = 3$ and calculate the error

3) Calculate the updated weight after three gradient descent updates for each of the following learning rates ($\alpha$)

  • $\alpha$ = 1
  • $\alpha$ = 0.1
  • $\alpha$ = 0.5

Hint:   $w_{new} = w_{old} - \alpha \frac{\partial E}{\partial w}$ 
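The three parts can be checked numerically. A minimal Python sketch, assuming the prediction is $wx$ with the identity activation and zero bias as stated in the question:

```python
# Single training record, identity activation, zero bias (from the question).
x, y = 2.0, 5.0

def cost(w):
    # J(w) = 1/2 * (predicted - target)^2 = 1/2 * (w*x - y)^2
    return 0.5 * (w * x - y) ** 2

def grad(w):
    # Chain rule: dJ/dw = (w*x - y) * x, which is 4w - 10 for x=2, y=5.
    return (w * x - y) * x

# Part 2: error at the initial weight w = 3.
print(cost(3.0))  # 0.5

# Part 3: three gradient-descent updates for each learning rate.
for alpha in (1.0, 0.1, 0.5):
    w = 3.0
    for step in range(1, 4):
        w -= alpha * grad(w)
        print(f"alpha={alpha}, step {step}: w={w:.4f}, J(w)={cost(w):.4f}")
```

Running this shows how sensitive the updates are to $\alpha$: with $\alpha = 1$ the weight diverges ($3 \to 1 \to 7 \to -11$), with $\alpha = 0.1$ it creeps toward the optimum $w = 2.5$ ($3 \to 2.8 \to 2.68 \to 2.608$), and with $\alpha = 0.5$ it oscillates between 3 and 2.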

  

1 Answer

0 votes
answered by (116k points)  
 
Best answer

commented by (220 points)  
Why are you substituting $w_{new}$ back into the original equation for the learning rates 0.1 and 0.5 but not for 1?
commented by (550 points)  
If we follow the same procedure for $\alpha = 1$, the error grows very large in that case: $E_1 = 4.5$, $E_2 = 40.5$, $E_3 = 364.5$
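These numbers can be verified with a quick sketch (assuming the setup from the question: $x = 2$, $y = 5$, $w_0 = 3$, $\alpha = 1$); the third error works out to exactly 364.5:

```python
# Verify the diverging errors for alpha = 1 (setup from the question:
# x = 2, y = 5, w0 = 3, E = 1/2 * (2w - 5)^2).
w, errors = 3.0, []
for _ in range(3):
    w -= 1.0 * (4 * w - 10)               # dE/dw = 4w - 10
    errors.append(0.5 * (2 * w - 5) ** 2)
print(errors)  # [4.5, 40.5, 364.5]
```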
commented by (100 points)  
Isn't the derivative in 3) wrong? I got $2w - 5$ instead of $4w - 10$. I think the person in the solution forgot to incorporate the $\frac{1}{2}$.
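For reference, carrying out the chain rule on the cost defined in the question gives:

$$\frac{\partial E}{\partial w} = \frac{\partial}{\partial w}\,\frac{1}{2}(2w - 5)^2 = \frac{1}{2}\cdot 2\,(2w - 5)\cdot\frac{\partial(2w - 5)}{\partial w} = (2w - 5)\cdot 2 = 4w - 10$$

so the $\frac{1}{2}$ is already accounted for; the extra factor of 2 that turns $2w - 5$ into $4w - 10$ comes from the inner derivative $\partial(2w)/\partial w = 2$.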
...