2024 GR5245 HW3 - Due 10/27 - 11pm
Homework Assignment 3
HOMEWORK GUIDELINE
Submit your completed homework as an html file to CourseWorks by the specified due date and time.
You can find instructions in GR5245_HwkSubmit.ipynb on how a Jupyter notebook (.ipynb file)
can be converted into an html file. Before you convert your file, please ensure that all lines of code have
been executed so that the desired results are shown. The TA will grade your homework based on what is
shown in the html file and will not attempt to re-execute the lines of code in your homework.
Note: If you suspect there are typos in this homework, or that some questions are wrong, please feel free
to email the instructor.
QUESTION 1
In this question we will investigate the effects of using batch normalization and data augmentation in
convolutional neural networks for image classification.
We will use the CIFAR-10 dataset (https://www.cs.toronto.edu/~kriz/cifar.html) in this investigation.
Typically the testing set is used to evaluate the performance of a trained model; here we will use the
testing set as the validation set to examine the effects of these regularization techniques.
(a) Download the CIFAR-10 training and testing datasets from torchvision.datasets. Use
methods in transforms.v2 to convert the images to tensors with values between 0 and 1. The
testing set will be used for validation.
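For illustration, here is a minimal sketch of one way part (a) might be done, assuming a recent
torchvision (roughly 0.16 or later) where transforms.v2 provides ToImage and ToDtype; the root
directory "data" is an arbitrary choice:

import torch
from torchvision import datasets
from torchvision.transforms import v2

# Convert PIL images to float32 tensors scaled to [0, 1].
transform = v2.Compose([
    v2.ToImage(),                           # PIL image -> image tensor (uint8, CHW)
    v2.ToDtype(torch.float32, scale=True),  # uint8 [0, 255] -> float32 [0, 1]
])

train_ds = datasets.CIFAR10(root="data", train=True, download=True, transform=transform)
val_ds = datasets.CIFAR10(root="data", train=False, download=True, transform=transform)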
(b) Create data loaders for training and validation so that minibatches of 128 training examples are
used in the training loop. Pick your own batch size for the validation set.
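Continuing the sketch above (the validation batch size of 256 is an arbitrary choice):

from torch.utils.data import DataLoader

train_loader = DataLoader(train_ds, batch_size=128, shuffle=True)   # shuffle for training
val_loader = DataLoader(val_ds, batch_size=256, shuffle=False)      # order does not matter for evaluation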
(c) Display the first 10 images to get a feel for them.
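One way to display the first 10 training images, assuming matplotlib is available and reusing
train_ds from the sketch above:

import matplotlib.pyplot as plt

fig, axes = plt.subplots(1, 10, figsize=(15, 2))
for i, ax in enumerate(axes):
    image, label = train_ds[i]
    ax.imshow(image.permute(1, 2, 0).numpy())   # CHW -> HWC for matplotlib
    ax.set_title(train_ds.classes[label], fontsize=8)
    ax.axis("off")
plt.show()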
(d) Define a function for training a model for one epoch and a function for evaluating a model.
def train_one_epoch(dataloader, model, loss_fn, optimizer):
    model.train()
    ...
    return train_loss

def evaluate(dataloader, model, loss_fn):
    model.eval()
    ...
    return eval_loss, eval_accuracy
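As a hedged illustration, the bodies of these functions might be filled in along the following lines
(a minimal sketch; device handling, e.g. moving batches to a GPU, is omitted):

import torch

def train_one_epoch(dataloader, model, loss_fn, optimizer):
    model.train()
    total_loss = 0.0
    for X, y in dataloader:
        pred = model(X)
        loss = loss_fn(pred, y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        total_loss += loss.item() * X.size(0)   # accumulate sum of per-example losses
    train_loss = total_loss / len(dataloader.dataset)
    return train_loss

def evaluate(dataloader, model, loss_fn):
    model.eval()
    total_loss, correct = 0.0, 0
    with torch.no_grad():                       # no gradients needed for evaluation
        for X, y in dataloader:
            pred = model(X)
            total_loss += loss_fn(pred, y).item() * X.size(0)
            correct += (pred.argmax(dim=1) == y).sum().item()
    n = len(dataloader.dataset)
    eval_loss = total_loss / n
    eval_accuracy = correct / n
    return eval_loss, eval_accuracy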
For the following experiments, we will (a setup sketch follows the list):
• use torch.nn.CrossEntropyLoss as the loss function
• use torch.optim.Adam with default parameter values for the optimization steps
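For instance, assuming the network to be trained is stored in a variable named model (a
hypothetical name), this setup might look like:

import torch
from torch import nn

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters())  # defaults: lr=1e-3, betas=(0.9, 0.999)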