Deep Learning Week 201-2
Cost function: $J(x, W, b, y) = J(\hat{y}, y)$ → It measures the difference between the predicted output $\hat{y}$ (which is a function of the input $x$, the weights $W$, and the biases $b$) and the true output $y$, providing a quantitative measure of how well the network is performing.
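To make the dependence on $x$, $W$, and $b$ concrete, here is a minimal Python/NumPy sketch. The one-layer sigmoid model and the squared-error placeholder cost are assumptions for illustration only, not a prescription from the notes:

import numpy as np

def predict(x, W, b):
    # Hypothetical one-layer model: y_hat is a function of the input x,
    # the weights W, and the bias b (sigmoid activation assumed here).
    return 1.0 / (1.0 + np.exp(-(W @ x + b)))

def cost(x, W, b, y):
    # J(x, W, b, y) = J(y_hat, y): compare the prediction with the true output.
    # Squared error is used only as a placeholder measure of the difference.
    y_hat = predict(x, W, b)
    return float(np.mean((y_hat - y) ** 2))

# Toy example: 3 input features, 1 output.
x = np.array([0.5, -1.2, 0.3])
W = np.array([[0.1, 0.4, -0.2]])
b = np.array([0.05])
y = np.array([1.0])
print(cost(x, W, b, y))  # a single scalar measuring how well the model performs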
Examples of cost functions:
Cross-entropy cost function: $J_{CE}(y, \hat{y}) = -\sum_{i=1}^{m} y^{(i)} \log \hat{y}^{(i)}$ → It measures the dissimilarity between the predicted probabilities $\hat{y}^{(i)}$ and the true labels $y^{(i)}$ across all $m$ training examples; minimizing this cost during training improves the network's performance.
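A minimal sketch of the cross-entropy cost in NumPy, assuming one-hot encoded labels and one row of predicted probabilities per example (the toy numbers are illustrative only):

import numpy as np

def cross_entropy_cost(y, y_hat, eps=1e-12):
    # J_CE(y, y_hat) = -sum_{i=1}^{m} y^(i) * log(y_hat^(i)), summed over the
    # m training examples (and over classes when labels are one-hot encoded).
    y_hat = np.clip(y_hat, eps, 1.0)  # guard against log(0)
    return float(-np.sum(y * np.log(y_hat)))

# Toy example: m = 3 examples, 3 classes, one-hot true labels.
y = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 1]])
y_hat = np.array([[0.7, 0.2, 0.1],   # predicted probabilities per example
                  [0.1, 0.8, 0.1],
                  [0.2, 0.2, 0.6]])
print(cross_entropy_cost(y, y_hat))  # lower is better; 0 only for perfect predictions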
Mean Absolute Error (MAE) cost function: $J_{MAE}(y, \hat{y}) = \frac{1}{m} \sum_{i=1}^{m} \lvert y^{(i)} - \hat{y}^{(i)} \rvert$ → It calculates the average absolute difference between the predicted values $\hat{y}^{(i)}$ and the actual values $y^{(i)}$ across all $m$ examples. MAE penalizes all errors equally (linearly), making it less sensitive to outliers compared to the Mean Squared Error (MSE).