AI Lab 12 Lab Tasks - 39
F21BETEN1M01039
A- Outcomes:
After completing this lab session, students will be able:
a. To understand shallow neural networks
b. To understand convolutional neural networks
c. To understand the visualization of convolutions and pooling
B- Lab Tasks:
1- Try editing the convolutions. Change the 32s to either 16 or 64. What impact will this have on accuracy and/or training time?
Write/copy your code here:
Code:
import tensorflow as tf
Output:
Code:
import tensorflow as tf
# Assumed completion of the truncated model: the Conv2D filter count changed from 32
# (16 shown here as one of the variants tried)
model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(16, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])
Output:
Code:
import tensorflow as tf
Output:
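For reference, a minimal self-contained sketch of this experiment, assuming the standard two-convolution Fashion MNIST model from this lab; the build_model helper and the loop over filter counts are illustrative, not the exact code used above:

import tensorflow as tf

fmnist = tf.keras.datasets.fashion_mnist
(training_images, training_labels), (test_images, test_labels) = fmnist.load_data()
training_images = training_images.reshape(-1, 28, 28, 1) / 255.0
test_images = test_images.reshape(-1, 28, 28, 1) / 255.0

def build_model(filters):
    # Same layer stack as the lab model, with the Conv2D filter count as a parameter
    return tf.keras.models.Sequential([
        tf.keras.layers.Conv2D(filters, (3, 3), activation='relu', input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPooling2D(2, 2),
        tf.keras.layers.Conv2D(filters, (3, 3), activation='relu'),
        tf.keras.layers.MaxPooling2D(2, 2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation='relu'),
        tf.keras.layers.Dense(10, activation='softmax')
    ])

for filters in (16, 32, 64):
    model = build_model(filters)
    model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
    model.fit(training_images, training_labels, epochs=5)
    print(f'{filters} filters:', model.evaluate(test_images, test_labels))

Fewer filters typically shorten training time with little change in accuracy on Fashion MNIST, while more filters increase training time without a guaranteed accuracy gain.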
2- Remove the final convolution. What impact will this have on accuracy or training
time?
Write/copy your code here:
Code:
import tensorflow as tf
# Assumed completion of the truncated model: the final Conv2D/MaxPooling pair removed,
# leaving a single convolution (filter count illustrative)
model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])
Output:
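For reference, a self-contained sketch of the full experiment with the single-convolution model; the data preparation lines and filter count are assumptions matching the rest of the lab:

import tensorflow as tf

fmnist = tf.keras.datasets.fashion_mnist
(training_images, training_labels), (test_images, test_labels) = fmnist.load_data()
training_images = training_images.reshape(-1, 28, 28, 1) / 255.0
test_images = test_images.reshape(-1, 28, 28, 1) / 255.0

# One Conv2D/MaxPooling pair only; the final convolution has been removed
model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
model.fit(training_images, training_labels, epochs=5)
print('Test:', model.evaluate(test_images, test_labels))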
3- How about adding more convolutions? What impact do you think this will have?
Experiment with it.
Write/copy your code here:
Code:
import tensorflow as tf
fmnist = tf.keras.datasets.fashion_mnist
(training_images, training_labels), (test_images, test_labels) = fmnist.load_data()
Output:
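A minimal self-contained sketch with a third convolution added; the reshape/scaling lines and filter counts are illustrative so the sketch runs on its own:

import tensorflow as tf

fmnist = tf.keras.datasets.fashion_mnist
(training_images, training_labels), (test_images, test_labels) = fmnist.load_data()
training_images = training_images.reshape(-1, 28, 28, 1) / 255.0
test_images = test_images.reshape(-1, 28, 28, 1) / 255.0

# Three convolutions instead of two; each pooling halves the feature maps,
# so the later convolutions work on progressively smaller inputs
model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu'),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
model.fit(training_images, training_labels, epochs=5)
print('Test:', model.evaluate(test_images, test_labels))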
4- Remove all convolutions but the first. What impact do you think this will have?
Experiment with it.
Write/copy your code here:
Code:
import tensorflow as tf

# Assumed data preparation: load Fashion MNIST, add a channel dimension for Conv2D
# and scale pixel values to [0, 1]
fmnist = tf.keras.datasets.fashion_mnist
(training_images, training_labels), (test_images, test_labels) = fmnist.load_data()
training_images = training_images.reshape(-1, 28, 28, 1) / 255.0
test_images = test_images.reshape(-1, 28, 28, 1) / 255.0

# Only the first convolution is kept; the remaining convolutions have been removed
model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(16, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

print('\nModel Training')
model.fit(training_images, training_labels, epochs=5)

print('\nModel Evaluation')
test_loss, test_acc = model.evaluate(test_images, test_labels)
print(f'Test Loss: {test_loss}')
Output:
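With no pooling after the remaining convolution, the flattened feature map stays large, so the first Dense layer tends to carry far more weights than in the pooled two-convolution model; that parameter count is the main driver of the change in training time. Printing the model summary for the model defined above makes this visible:

# Per-layer output shapes and parameter counts for the model defined above
model.summary()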
5- In the previous lab you implemented a callback to monitor the loss and cancel training once it dropped below a certain value. Implement that here.
Write/copy your code here:
Code:
import tensorflow as tf
class MyCallback(tf.keras.callbacks.Callback):
    # Cancel training once the end-of-epoch loss drops below 0.4
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        if logs.get('loss', float('inf')) < 0.4:
            print("\nLoss is low so cancelling training!")
            self.model.stop_training = True
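A minimal sketch of how MyCallback might be wired into training via the callbacks argument of model.fit, assuming the Fashion MNIST data and an illustrative model like the ones used earlier in this lab:

import tensorflow as tf

fmnist = tf.keras.datasets.fashion_mnist
(training_images, training_labels), (test_images, test_labels) = fmnist.load_data()
training_images = training_images.reshape(-1, 28, 28, 1) / 255.0
test_images = test_images.reshape(-1, 28, 28, 1) / 255.0

# Illustrative model; any of the models from the earlier tasks would work here
model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])

# Training stops early once the epoch-end loss drops below 0.4
model.fit(training_images, training_labels, epochs=20, callbacks=[MyCallback()])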