CHAPTER 3.3 - Activation - Loss - Accuracy (in Python)
import numpy as np

# Linear activation: passes the inputs through unchanged
class Activation_Linear:
    # Forward pass
    def forward(self, inputs):
        # Calculate output values from inputs
        self.output = inputs

# Sigmoid activation: squashes each input into the range (0, 1)
class Activation_Sigmoid:
    # Forward pass
    def forward(self, inputs):
        # Calculate output values from inputs
        self.output = 1 / (1 + np.exp(-inputs))

# ReLU activation: replaces negative inputs with 0
class Activation_ReLU:
    # Forward pass
    def forward(self, inputs):
        # Calculate output values from inputs
        self.output = np.maximum(0, inputs)
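As a quick sanity check, the sketch below runs a small batch through the Sigmoid and ReLU classes; the 2x3 input values are an assumed example, not taken from the chapter.

# Usage sketch (the input batch is an assumed example)
inputs = np.array([[1.0, -2.0, 3.0],
                   [-1.5, 2.7, -0.8]])

relu = Activation_ReLU()
relu.forward(inputs)
print(relu.output)     # negative values are replaced by 0

sigmoid = Activation_Sigmoid()
sigmoid.forward(inputs)
print(sigmoid.output)  # every value lies strictly between 0 and 1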
The softmax activation converts the raw output values into the probabilities of the output neurons. The sum of the probabilities is equal to 1.

class Activation_Softmax:
    # Forward pass
    def forward(self, inputs):
        # Subtract the row-wise max before exponentiating, for numerical stability
        exp_values = np.exp(inputs - np.max(inputs, axis=1, keepdims=True))
        # Normalize so that each row sums to 1
        self.output = exp_values / np.sum(exp_values, axis=1, keepdims=True)
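The sketch below demonstrates this property; the batch of input scores is an assumed example.

# Usage sketch (the input scores are an assumed example)
softmax = Activation_Softmax()
softmax.forward(np.array([[4.8, 1.21, 2.385],
                          [0.5, 3.1, 1.2]]))
print(softmax.output)                  # each row is a probability distribution
print(np.sum(softmax.output, axis=1))  # each row sums to 1.0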
Categorical Cross-Entropy (CCE) loss function:

L = -\sum_i y_i \log(\hat{y}_i)

where y is the actual (one-hot) distribution and \hat{y} is the predicted distribution.

Example: y = [1, 0, 0] and \hat{y}_softmax = [0.7, 0.2, 0.1], so L = -log(0.7) ≈ 0.357.
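Putting the formula into code, here is a minimal sketch of the CCE loss; the function name cce_loss and the clipping step that avoids log(0) are my additions, not taken from the chapter.

# Minimal CCE sketch; cce_loss and the clipping values are assumptions
def cce_loss(y_pred, y_true):
    # Clip predictions to avoid log(0)
    y_pred = np.clip(y_pred, 1e-7, 1 - 1e-7)
    # -sum(y_i * log(y_hat_i)) per sample, averaged over the batch
    return np.mean(-np.sum(y_true * np.log(y_pred), axis=1))

# Reproduces the example above: -log(0.7) ≈ 0.357
print(cce_loss(np.array([[0.7, 0.2, 0.1]]), np.array([[1, 0, 0]])))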
For the classification problem, one simple way to compute the accuracy is to compare the softmax output vector (using the argmax function) with the label vector. An example of code is as follows:

# Calculate the index of the max value along the second axis (one prediction per sample)
predictions = np.argmax(softmax_outputs, axis=1)
print(predictions)

# If the targets are one-hot encoded, convert them to class indices
if len(class_targets.shape) == 2:
    class_targets = np.argmax(class_targets, axis=1)

# Calculate the accuracy (True = 1, False = 0)
accuracy = np.mean(predictions == class_targets)
print("Accuracy: ", accuracy)