Back Propagation Concept: Where, y = Actual Output and h(x) = Calculated Output
$$E = \frac{1}{2}\,\mathrm{Err}^2$$
$$E = \frac{1}{2}\,\big(y - h(x)\big)^2$$
For a unit with input $x_j$ on weight $w_j$, gradient descent moves each weight against the error gradient:
$$w_j \leftarrow w_j + \alpha\,\Delta W_j, \qquad \Delta W_j = -\frac{\partial E}{\partial w_j}$$
where $\alpha$ is the learning rate.
$$\frac{\partial E}{\partial w_j} = \frac{\partial}{\partial w_j}\left(\frac{1}{2}\,\mathrm{Err}^2\right)$$
$$= \mathrm{Err}\cdot\frac{\partial}{\partial w_j}\Big(y - g\big(\sum\nolimits_j w_j x_j\big)\Big)$$
$$= -\,\mathrm{Err}\cdot\frac{\partial}{\partial w_j}\,g\big(\sum\nolimits_j w_j x_j\big)$$
$$= -\,\mathrm{Err}\cdot g'\big(\sum\nolimits_j w_j x_j\big)\cdot x_j$$
Therefore,
$$w_j^{\text{new}} = w_j^{\text{old}} - \alpha\,\frac{\partial E}{\partial w_j} = w_j^{\text{old}} + \alpha\cdot\mathrm{Err}\cdot g'\big(\sum\nolimits_j w_j x_j\big)\cdot x_j$$
Let $\delta_j = \mathrm{Err}\cdot g'(in_j)$, where $in_j = \sum_j w_j x_j$.
$$w_j^{\text{new}} = w_j^{\text{old}} + \alpha\cdot\delta_j\cdot x_j$$
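As a minimal sketch of this update rule for a single unit, assuming a sigmoid activation $g$ (so $g'(in) = g(in)\,(1-g(in))$); the function names and sample values are illustrative, not from the notes:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def delta_rule_step(w, x, y, alpha=0.2):
    """One gradient-descent step: w_new = w_old + alpha * delta_j * x_j."""
    in_j = np.dot(w, x)           # in_j = sum_j w_j * x_j
    h = sigmoid(in_j)             # calculated output h(x) = g(in_j)
    err = y - h                   # Err = actual output - calculated output
    delta = err * h * (1.0 - h)   # delta_j = Err * g'(in_j), sigmoid derivative
    return w + alpha * delta * x  # w_j(new) = w_j(old) + alpha * delta_j * x_j

# Illustrative usage with small random initial weights
rng = np.random.default_rng(0)
w = rng.uniform(-0.5, 0.5, size=3)
x = np.array([1.0, 0.57, 0.43])
w = delta_rule_step(w, x, y=1.0)
```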
Back Propagation Algorithm
1. Initialize all the weights with small random values.
2. Find the output using the feed-forward network, i.e.
   $$y = g\Big(\sum\nolimits_j w_j x_j\Big)$$
3. Compute the error term for each output unit $k$: $\delta_k = \mathrm{Err}_k \cdot g'(y_{in_k})$.
4. Update $\Delta w_{jk} = \alpha\,\delta_k\,z_j$.
5. Compute the summed delta input for each hidden unit $j$: $\delta_{in_j} = \sum_k \delta_k\,w_{jk}$.
6. The delta term for the $j$th hidden unit: $\delta_j = \delta_{in_j}\cdot g'(z_{in_j})$, giving $\Delta w_{ij} = \alpha\,\delta_j\,x_i$.
7. Update:
   $$w_{jk} = w_{jk} + \Delta w_{jk}, \qquad w_{ij} = w_{ij} + \Delta w_{ij}$$
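The seven steps above can be sketched for one hidden layer roughly as follows, assuming sigmoid activations on both layers and a single training pair (x, t); the NumPy layout and names (backprop_step, w_ij, w_jk) follow the index convention of the algorithm but are otherwise an assumption:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_step(x, t, w_ij, w_jk, alpha=0.2):
    """One pass of steps 2-7 for a single hidden layer.

    w_ij: input-to-hidden weights, shape (n_in, n_hidden)
    w_jk: hidden-to-output weights, shape (n_hidden, n_out)
    """
    # Step 2: feed forward
    z_in = x @ w_ij                       # z_in_j = sum_i x_i * w_ij
    z = sigmoid(z_in)                     # hidden activations z_j
    y_in = z @ w_jk                       # y_in_k = sum_j z_j * w_jk
    y = sigmoid(y_in)                     # outputs y_k

    # Step 3: output error terms, delta_k = Err_k * g'(y_in_k)
    delta_k = (t - y) * y * (1.0 - y)

    # Step 4: hidden-to-output weight corrections
    dw_jk = alpha * np.outer(z, delta_k)

    # Steps 5-6: hidden delta terms, delta_j = (sum_k delta_k * w_jk) * g'(z_in_j)
    delta_in = delta_k @ w_jk.T
    delta_j = delta_in * z * (1.0 - z)
    dw_ij = alpha * np.outer(x, delta_j)

    # Step 7: update the weights
    return w_ij + dw_ij, w_jk + dw_jk
```

Step 1 corresponds to drawing w_ij and w_jk from a small random range before the first call, e.g. rng.uniform(-0.5, 0.5, size=(n_in, n_hidden)).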
Since the output is not 0, we calculate the error for each unit and update the weights.
Taking $\alpha = 0.2$,
$$\Delta w_{13} = 0.2 \times (-0.143) \times 0.57 = -0.0163$$
$$\Delta w_{12} = 0.2 \times (-0.143) \times 0.43 = -0.012$$
Now,
$$w_{13}(\text{new}) = (-0.2) + \Delta w_{13} = -0.2163$$
$$w_{12}(\text{new}) = 0.3 + \Delta w_{12} = 0.288$$
Similar steps are repeated until |calculated output - expected output| < 0.0001 (i.e., the error bound).
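The arithmetic of this update can be checked directly; the values below are taken from the example above, and Python here is only a convenience:

```python
alpha = 0.2
delta = -0.143           # error term for the output unit (from the example)
x13, x12 = 0.57, 0.43    # activations feeding w13 and w12

dw13 = alpha * delta * x13   # = -0.0163
dw12 = alpha * delta * x12   # = -0.0123, rounded to -0.012
w13_new = -0.2 + dw13        # = -0.2163
w12_new = 0.3 + dw12         # = 0.2877, rounded to 0.288
```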