XRAY

This document contains the output of running a Python script that trains a convolutional neural network model on image data. It shows information about the dataset, model architecture including layers and parameters, and training configuration.

C:\Users\benin\anaconda3\envs\ultralytics\python.exe "G:\project code january\osteo_artritis\1000 dens.py"

Found 3586 validated image filenames belonging to 5 classes.

Found 399 validated image filenames belonging to 5 classes.

Found 1656 validated image filenames belonging to 5 classes.
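The three "Found … validated image filenames" messages are the standard output of Keras `flow_from_dataframe()` calls, one each for the training, validation, and test splits (3586 / 399 / 1656 files, 5 classes). Below is a minimal sketch of how such messages are produced, using a tiny synthetic dataset; the dataframe columns, batch size, and directory layout are assumptions, since the log only shows the resulting counts.

```python
# Hedged sketch: three flow_from_dataframe() calls would print the
# "Found N validated image filenames belonging to K classes." lines above.
import os
import tempfile

import numpy as np
import pandas as pd
from PIL import Image
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Build a tiny stand-in dataset (10 black 224x224 images, 5 labels).
root = tempfile.mkdtemp()
rows = []
for i in range(10):
    path = os.path.join(root, f"img_{i}.png")
    Image.fromarray(np.zeros((224, 224, 3), dtype=np.uint8)).save(path)
    rows.append({"filename": path, "label": str(i % 5)})  # 5 classes, as in the log
df = pd.DataFrame(rows)

gen = ImageDataGenerator(rescale=1.0 / 255)
train_gen = gen.flow_from_dataframe(
    df, x_col="filename", y_col="label",
    target_size=(224, 224), class_mode="categorical", batch_size=32,
)  # prints: Found 10 validated image filenames belonging to 5 classes.
```

The real script presumably makes one such call per split; only the printed counts are visible in the log.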

2024-03-27 13:03:29.655293: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.

WARNING:absl:`lr` is deprecated, please use `learning_rate` instead, or use the legacy optimizer, e.g., tf.keras.optimizers.legacy.Adam.

Model: "model"
__________________________________________________________________________________________________
Layer (type)                                 Output Shape           Param #   Connected to
==================================================================================================
input_1 (InputLayer)                         [(None, 224, 224, 3)]  0         []
conv2d (Conv2D)                              (None, 111, 111, 32)   864       ['input_1[0][0]']
batch_normalization (BatchNormalization)     (None, 111, 111, 32)   96        ['conv2d[0][0]']
activation (Activation)                      (None, 111, 111, 32)   0         ['batch_normalization[0][0]']
conv2d_1 (Conv2D)                            (None, 109, 109, 32)   9216      ['activation[0][0]']
batch_normalization_1 (BatchNormalization)   (None, 109, 109, 32)   96        ['conv2d_1[0][0]']
activation_1 (Activation)                    (None, 109, 109, 32)   0         ['batch_normalization_1[0][0]']
conv2d_2 (Conv2D)                            (None, 109, 109, 64)   18432     ['activation_1[0][0]']
batch_normalization_2 (BatchNormalization)   (None, 109, 109, 64)   192       ['conv2d_2[0][0]']
activation_2 (Activation)                    (None, 109, 109, 64)   0         ['batch_normalization_2[0][0]']
max_pooling2d (MaxPooling2D)                 (None, 54, 54, 64)     0         ['activation_2[0][0]']
conv2d_3 (Conv2D)                            (None, 54, 54, 80)     5120      ['max_pooling2d[0][0]']
batch_normalization_3 (BatchNormalization)   (None, 54, 54, 80)     240       ['conv2d_3[0][0]']
activation_3 (Activation)                    (None, 54, 54, 80)     0         ['batch_normalization_3[0][0]']
conv2d_4 (Conv2D)                            (None, 52, 52, 192)    138240    ['activation_3[0][0]']
batch_normalization_4 (BatchNormalization)   (None, 52, 52, 192)    576       ['conv2d_4[0][0]']
activation_4 (Activation)                    (None, 52, 52, 192)    0         ['batch_normalization_4[0][0]']
max_pooling2d_1 (MaxPooling2D)               (None, 25, 25, 192)    0         ['activation_4[0][0]']
conv2d_8 (Conv2D)                            (None, 25, 25, 64)     12288     ['max_pooling2d_1[0][0]']
batch_normalization_8 (BatchNormalization)   (None, 25, 25, 64)     192       ['conv2d_8[0][0]']
activation_8 (Activation)                    (None, 25, 25, 64)     0         ['batch_normalization_8[0][0]']
conv2d_6 (Conv2D)                            (None, 25, 25, 48)     9216      ['max_pooling2d_1[0][0]']
conv2d_9 (Conv2D)                            (None, 25, 25, 96)     55296     ['activation_8[0][0]']
batch_normalization_6 (BatchNormalization)   (None, 25, 25, 48)     144       ['conv2d_6[0][0]']
batch_normalization_9 (BatchNormalization)   (None, 25, 25, 96)     288       ['conv2d_9[0][0]']
activation_6 (Activation)                    (None, 25, 25, 48)     0         ['batch_normalization_6[0][0]']
activation_9 (Activation)                    (None, 25, 25, 96)     0         ['batch_normalization_9[0][0]']
average_pooling2d (AveragePooling2D)         (None, 25, 25, 192)    0         ['max_pooling2d_1[0][0]']
conv2d_5 (Conv2D)                            (None, 25, 25, 64)     12288     ['max_pooling2d_1[0][0]']
conv2d_7 (Conv2D)                            (None, 25, 25, 64)     76800     ['activation_6[0][0]']
conv2d_10 (Conv2D)                           (None, 25, 25, 96)     82944     ['activation_9[0][0]']
conv2d_11 (Conv2D)                           (None, 25, 25, 32)     6144      ['average_pooling2d[0][0]']
batch_normalization_5 (BatchNormalization)   (None, 25, 25, 64)     192       ['conv2d_5[0][0]']
batch_normalization_7 (BatchNormalization)   (None, 25, 25, 64)     192       ['conv2d_7[0][0]']
batch_normalization_10 (BatchNormalization)  (None, 25, 25, 96)     288       ['conv2d_10[0][0]']
batch_normalization_11 (BatchNormalization)  (None, 25, 25, 32)     96        ['conv2d_11[0][0]']
activation_5 (Activation)                    (None, 25, 25, 64)     0         ['batch_normalization_5[0][0]']
activation_7 (Activation)                    (None, 25, 25, 64)     0         ['batch_normalization_7[0][0]']
activation_10 (Activation)                   (None, 25, 25, 96)     0         ['batch_normalization_10[0][0]']
activation_11 (Activation)                   (None, 25, 25, 32)     0         ['batch_normalization_11[0][0]']
mixed0 (Concatenate)                         (None, 25, 25, 256)    0         ['activation_5[0][0]', 'activation_7[0][0]', 'activation_10[0][0]', 'activation_11[0][0]']
conv2d_15 (Conv2D)                           (None, 25, 25, 64)     16384     ['mixed0[0][0]']
batch_normalization_15 (BatchNormalization)  (None, 25, 25, 64)     192       ['conv2d_15[0][0]']
activation_15 (Activation)                   (None, 25, 25, 64)     0         ['batch_normalization_15[0][0]']
conv2d_13 (Conv2D)                           (None, 25, 25, 48)     12288     ['mixed0[0][0]']
conv2d_16 (Conv2D)                           (None, 25, 25, 96)     55296     ['activation_15[0][0]']
batch_normalization_13 (BatchNormalization)  (None, 25, 25, 48)     144       ['conv2d_13[0][0]']
batch_normalization_16 (BatchNormalization)  (None, 25, 25, 96)     288       ['conv2d_16[0][0]']
activation_13 (Activation)                   (None, 25, 25, 48)     0         ['batch_normalization_13[0][0]']
activation_16 (Activation)                   (None, 25, 25, 96)     0         ['batch_normalization_16[0][0]']
average_pooling2d_1 (AveragePooling2D)       (None, 25, 25, 256)    0         ['mixed0[0][0]']
conv2d_12 (Conv2D)                           (None, 25, 25, 64)     16384     ['mixed0[0][0]']
conv2d_14 (Conv2D)                           (None, 25, 25, 64)     76800     ['activation_13[0][0]']
conv2d_17 (Conv2D)                           (None, 25, 25, 96)     82944     ['activation_16[0][0]']
conv2d_18 (Conv2D)                           (None, 25, 25, 64)     16384     ['average_pooling2d_1[0][0]']
batch_normalization_12 (BatchNormalization)  (None, 25, 25, 64)     192       ['conv2d_12[0][0]']
batch_normalization_14 (BatchNormalization)  (None, 25, 25, 64)     192       ['conv2d_14[0][0]']
batch_normalization_17 (BatchNormalization)  (None, 25, 25, 96)     288       ['conv2d_17[0][0]']
batch_normalization_18 (BatchNormalization)  (None, 25, 25, 64)     192       ['conv2d_18[0][0]']
activation_12 (Activation)                   (None, 25, 25, 64)     0         ['batch_normalization_12[0][0]']
activation_14 (Activation)                   (None, 25, 25, 64)     0         ['batch_normalization_14[0][0]']
activation_17 (Activation)                   (None, 25, 25, 96)     0         ['batch_normalization_17[0][0]']
activation_18 (Activation)                   (None, 25, 25, 64)     0         ['batch_normalization_18[0][0]']
mixed1 (Concatenate)                         (None, 25, 25, 288)    0         ['activation_12[0][0]', 'activation_14[0][0]', 'activation_17[0][0]', 'activation_18[0][0]']
conv2d_22 (Conv2D)                           (None, 25, 25, 64)     18432     ['mixed1[0][0]']
batch_normalization_22 (BatchNormalization)  (None, 25, 25, 64)     192       ['conv2d_22[0][0]']
activation_22 (Activation)                   (None, 25, 25, 64)     0         ['batch_normalization_22[0][0]']
conv2d_20 (Conv2D)                           (None, 25, 25, 48)     13824     ['mixed1[0][0]']
conv2d_23 (Conv2D)                           (None, 25, 25, 96)     55296     ['activation_22[0][0]']
batch_normalization_20 (BatchNormalization)  (None, 25, 25, 48)     144       ['conv2d_20[0][0]']
batch_normalization_23 (BatchNormalization)  (None, 25, 25, 96)     288       ['conv2d_23[0][0]']
activation_20 (Activation)                   (None, 25, 25, 48)     0         ['batch_normalization_20[0][0]']
activation_23 (Activation)                   (None, 25, 25, 96)     0         ['batch_normalization_23[0][0]']
average_pooling2d_2 (AveragePooling2D)       (None, 25, 25, 288)    0         ['mixed1[0][0]']
conv2d_19 (Conv2D)                           (None, 25, 25, 64)     18432     ['mixed1[0][0]']
conv2d_21 (Conv2D)                           (None, 25, 25, 64)     76800     ['activation_20[0][0]']
conv2d_24 (Conv2D)                           (None, 25, 25, 96)     82944     ['activation_23[0][0]']
conv2d_25 (Conv2D)                           (None, 25, 25, 64)     18432     ['average_pooling2d_2[0][0]']
batch_normalization_19 (BatchNormalization)  (None, 25, 25, 64)     192       ['conv2d_19[0][0]']
batch_normalization_21 (BatchNormalization)  (None, 25, 25, 64)     192       ['conv2d_21[0][0]']
batch_normalization_24 (BatchNormalization)  (None, 25, 25, 96)     288       ['conv2d_24[0][0]']
batch_normalization_25 (BatchNormalization)  (None, 25, 25, 64)     192       ['conv2d_25[0][0]']
activation_19 (Activation)                   (None, 25, 25, 64)     0         ['batch_normalization_19[0][0]']
activation_21 (Activation)                   (None, 25, 25, 64)     0         ['batch_normalization_21[0][0]']
activation_24 (Activation)                   (None, 25, 25, 96)     0         ['batch_normalization_24[0][0]']
activation_25 (Activation)                   (None, 25, 25, 64)     0         ['batch_normalization_25[0][0]']
mixed2 (Concatenate)                         (None, 25, 25, 288)    0         ['activation_19[0][0]', 'activation_21[0][0]', 'activation_24[0][0]', 'activation_25[0][0]']
conv2d_27 (Conv2D)                           (None, 25, 25, 64)     18432     ['mixed2[0][0]']
batch_normalization_27 (BatchNormalization)  (None, 25, 25, 64)     192       ['conv2d_27[0][0]']
activation_27 (Activation)                   (None, 25, 25, 64)     0         ['batch_normalization_27[0][0]']
conv2d_28 (Conv2D)                           (None, 25, 25, 96)     55296     ['activation_27[0][0]']
batch_normalization_28 (BatchNormalization)  (None, 25, 25, 96)     288       ['conv2d_28[0][0]']
activation_28 (Activation)                   (None, 25, 25, 96)     0         ['batch_normalization_28[0][0]']
conv2d_26 (Conv2D)                           (None, 12, 12, 384)    995328    ['mixed2[0][0]']
conv2d_29 (Conv2D)                           (None, 12, 12, 96)     82944     ['activation_28[0][0]']
batch_normalization_26 (BatchNormalization)  (None, 12, 12, 384)    1152      ['conv2d_26[0][0]']
batch_normalization_29 (BatchNormalization)  (None, 12, 12, 96)     288       ['conv2d_29[0][0]']
activation_26 (Activation)                   (None, 12, 12, 384)    0         ['batch_normalization_26[0][0]']
activation_29 (Activation)                   (None, 12, 12, 96)     0         ['batch_normalization_29[0][0]']
max_pooling2d_2 (MaxPooling2D)               (None, 12, 12, 288)    0         ['mixed2[0][0]']
mixed3 (Concatenate)                         (None, 12, 12, 768)    0         ['activation_26[0][0]', 'activation_29[0][0]', 'max_pooling2d_2[0][0]']
conv2d_34 (Conv2D)                           (None, 12, 12, 128)    98304     ['mixed3[0][0]']
batch_normalization_34 (BatchNormalization)  (None, 12, 12, 128)    384       ['conv2d_34[0][0]']
activation_34 (Activation)                   (None, 12, 12, 128)    0         ['batch_normalization_34[0][0]']
conv2d_35 (Conv2D)                           (None, 12, 12, 128)    114688    ['activation_34[0][0]']
batch_normalization_35 (BatchNormalization)  (None, 12, 12, 128)    384       ['conv2d_35[0][0]']
activation_35 (Activation)                   (None, 12, 12, 128)    0         ['batch_normalization_35[0][0]']
conv2d_31 (Conv2D)                           (None, 12, 12, 128)    98304     ['mixed3[0][0]']
conv2d_36 (Conv2D)                           (None, 12, 12, 128)    114688    ['activation_35[0][0]']
batch_normalization_31 (BatchNormalization)  (None, 12, 12, 128)    384       ['conv2d_31[0][0]']
batch_normalization_36 (BatchNormalization)  (None, 12, 12, 128)    384       ['conv2d_36[0][0]']
activation_31 (Activation)                   (None, 12, 12, 128)    0         ['batch_normalization_31[0][0]']
activation_36 (Activation)                   (None, 12, 12, 128)    0         ['batch_normalization_36[0][0]']
conv2d_32 (Conv2D)                           (None, 12, 12, 128)    114688    ['activation_31[0][0]']
conv2d_37 (Conv2D)                           (None, 12, 12, 128)    114688    ['activation_36[0][0]']
batch_normalization_32 (BatchNormalization)  (None, 12, 12, 128)    384       ['conv2d_32[0][0]']
batch_normalization_37 (BatchNormalization)  (None, 12, 12, 128)    384       ['conv2d_37[0][0]']
activation_32 (Activation)                   (None, 12, 12, 128)    0         ['batch_normalization_32[0][0]']
activation_37 (Activation)                   (None, 12, 12, 128)    0         ['batch_normalization_37[0][0]']
average_pooling2d_3 (AveragePooling2D)       (None, 12, 12, 768)    0         ['mixed3[0][0]']
conv2d_30 (Conv2D)                           (None, 12, 12, 192)    147456    ['mixed3[0][0]']
conv2d_33 (Conv2D)                           (None, 12, 12, 192)    172032    ['activation_32[0][0]']
conv2d_38 (Conv2D)                           (None, 12, 12, 192)    172032    ['activation_37[0][0]']
conv2d_39 (Conv2D)                           (None, 12, 12, 192)    147456    ['average_pooling2d_3[0][0]']
batch_normalization_30 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_30[0][0]']
batch_normalization_33 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_33[0][0]']
batch_normalization_38 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_38[0][0]']
batch_normalization_39 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_39[0][0]']
activation_30 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_30[0][0]']
activation_33 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_33[0][0]']
activation_38 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_38[0][0]']
activation_39 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_39[0][0]']
mixed4 (Concatenate)                         (None, 12, 12, 768)    0         ['activation_30[0][0]', 'activation_33[0][0]', 'activation_38[0][0]', 'activation_39[0][0]']
conv2d_44 (Conv2D)                           (None, 12, 12, 160)    122880    ['mixed4[0][0]']
batch_normalization_44 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_44[0][0]']
activation_44 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_44[0][0]']
conv2d_45 (Conv2D)                           (None, 12, 12, 160)    179200    ['activation_44[0][0]']
batch_normalization_45 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_45[0][0]']
activation_45 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_45[0][0]']
conv2d_41 (Conv2D)                           (None, 12, 12, 160)    122880    ['mixed4[0][0]']
conv2d_46 (Conv2D)                           (None, 12, 12, 160)    179200    ['activation_45[0][0]']
batch_normalization_41 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_41[0][0]']
batch_normalization_46 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_46[0][0]']
activation_41 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_41[0][0]']
activation_46 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_46[0][0]']
conv2d_42 (Conv2D)                           (None, 12, 12, 160)    179200    ['activation_41[0][0]']
conv2d_47 (Conv2D)                           (None, 12, 12, 160)    179200    ['activation_46[0][0]']
batch_normalization_42 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_42[0][0]']
batch_normalization_47 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_47[0][0]']
activation_42 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_42[0][0]']
activation_47 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_47[0][0]']
average_pooling2d_4 (AveragePooling2D)       (None, 12, 12, 768)    0         ['mixed4[0][0]']
conv2d_40 (Conv2D)                           (None, 12, 12, 192)    147456    ['mixed4[0][0]']
conv2d_43 (Conv2D)                           (None, 12, 12, 192)    215040    ['activation_42[0][0]']
conv2d_48 (Conv2D)                           (None, 12, 12, 192)    215040    ['activation_47[0][0]']
conv2d_49 (Conv2D)                           (None, 12, 12, 192)    147456    ['average_pooling2d_4[0][0]']
batch_normalization_40 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_40[0][0]']
batch_normalization_43 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_43[0][0]']
batch_normalization_48 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_48[0][0]']
batch_normalization_49 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_49[0][0]']
activation_40 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_40[0][0]']
activation_43 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_43[0][0]']
activation_48 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_48[0][0]']
activation_49 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_49[0][0]']
mixed5 (Concatenate)                         (None, 12, 12, 768)    0         ['activation_40[0][0]', 'activation_43[0][0]', 'activation_48[0][0]', 'activation_49[0][0]']
conv2d_54 (Conv2D)                           (None, 12, 12, 160)    122880    ['mixed5[0][0]']
batch_normalization_54 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_54[0][0]']
activation_54 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_54[0][0]']
conv2d_55 (Conv2D)                           (None, 12, 12, 160)    179200    ['activation_54[0][0]']
batch_normalization_55 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_55[0][0]']
activation_55 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_55[0][0]']
conv2d_51 (Conv2D)                           (None, 12, 12, 160)    122880    ['mixed5[0][0]']
conv2d_56 (Conv2D)                           (None, 12, 12, 160)    179200    ['activation_55[0][0]']
batch_normalization_51 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_51[0][0]']
batch_normalization_56 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_56[0][0]']
activation_51 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_51[0][0]']
activation_56 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_56[0][0]']
conv2d_52 (Conv2D)                           (None, 12, 12, 160)    179200    ['activation_51[0][0]']
conv2d_57 (Conv2D)                           (None, 12, 12, 160)    179200    ['activation_56[0][0]']
batch_normalization_52 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_52[0][0]']
batch_normalization_57 (BatchNormalization)  (None, 12, 12, 160)    480       ['conv2d_57[0][0]']
activation_52 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_52[0][0]']
activation_57 (Activation)                   (None, 12, 12, 160)    0         ['batch_normalization_57[0][0]']
average_pooling2d_5 (AveragePooling2D)       (None, 12, 12, 768)    0         ['mixed5[0][0]']
conv2d_50 (Conv2D)                           (None, 12, 12, 192)    147456    ['mixed5[0][0]']
conv2d_53 (Conv2D)                           (None, 12, 12, 192)    215040    ['activation_52[0][0]']
conv2d_58 (Conv2D)                           (None, 12, 12, 192)    215040    ['activation_57[0][0]']
conv2d_59 (Conv2D)                           (None, 12, 12, 192)    147456    ['average_pooling2d_5[0][0]']
batch_normalization_50 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_50[0][0]']
batch_normalization_53 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_53[0][0]']
batch_normalization_58 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_58[0][0]']
batch_normalization_59 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_59[0][0]']
activation_50 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_50[0][0]']
activation_53 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_53[0][0]']
activation_58 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_58[0][0]']
activation_59 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_59[0][0]']
mixed6 (Concatenate)                         (None, 12, 12, 768)    0         ['activation_50[0][0]', 'activation_53[0][0]', 'activation_58[0][0]', 'activation_59[0][0]']
conv2d_64 (Conv2D)                           (None, 12, 12, 192)    147456    ['mixed6[0][0]']
batch_normalization_64 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_64[0][0]']
activation_64 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_64[0][0]']
conv2d_65 (Conv2D)                           (None, 12, 12, 192)    258048    ['activation_64[0][0]']
batch_normalization_65 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_65[0][0]']
activation_65 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_65[0][0]']
conv2d_61 (Conv2D)                           (None, 12, 12, 192)    147456    ['mixed6[0][0]']
conv2d_66 (Conv2D)                           (None, 12, 12, 192)    258048    ['activation_65[0][0]']
batch_normalization_61 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_61[0][0]']
batch_normalization_66 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_66[0][0]']
activation_61 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_61[0][0]']
activation_66 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_66[0][0]']
conv2d_62 (Conv2D)                           (None, 12, 12, 192)    258048    ['activation_61[0][0]']
conv2d_67 (Conv2D)                           (None, 12, 12, 192)    258048    ['activation_66[0][0]']
batch_normalization_62 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_62[0][0]']
batch_normalization_67 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_67[0][0]']
activation_62 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_62[0][0]']
activation_67 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_67[0][0]']
average_pooling2d_6 (AveragePooling2D)       (None, 12, 12, 768)    0         ['mixed6[0][0]']
conv2d_60 (Conv2D)                           (None, 12, 12, 192)    147456    ['mixed6[0][0]']
conv2d_63 (Conv2D)                           (None, 12, 12, 192)    258048    ['activation_62[0][0]']
conv2d_68 (Conv2D)                           (None, 12, 12, 192)    258048    ['activation_67[0][0]']
conv2d_69 (Conv2D)                           (None, 12, 12, 192)    147456    ['average_pooling2d_6[0][0]']
batch_normalization_60 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_60[0][0]']
batch_normalization_63 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_63[0][0]']
batch_normalization_68 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_68[0][0]']
batch_normalization_69 (BatchNormalization)  (None, 12, 12, 192)    576       ['conv2d_69[0][0]']
activation_60 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_60[0][0]']
activation_63 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_63[0][0]']
activation_68 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_68[0][0]']
activation_69 (Activation)                   (None, 12, 12, 192)    0         ['batch_normalization_69[0][0]']
mixed7 (Concatenate)                         (None, 12, 12, 768)    0         ['activation_60[0][0]', 'activation_63[0][0]', 'activation_68[0][0]', 'activation_69[0][0]']
flatten (Flatten)                            (None, 110592)         0         ['mixed7[0][0]']
dense (Dense)                                (None, 512)            56623616  ['flatten[0][0]']
dropout (Dropout)                            (None, 512)            0         ['dense[0][0]']
dense_1 (Dense)                              (None, 5)              2565      ['dropout[0][0]']
==================================================================================================
Total params: 65,601,445
Trainable params: 56,626,181
Non-trainable params: 8,975,264
__________________________________________________________________________________________________
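The summary above is an InceptionV3 trunk truncated at the `mixed7` concatenation, frozen (its 8,975,264 parameters are exactly the non-trainable count), followed by a `Flatten` → `Dense(512)` → `Dropout` → `Dense(5)` head whose 56,623,616 + 2,565 parameters account for the 56,626,181 trainable parameters. The following is a hedged reconstruction; the dropout rate, activations, and weight initialization are assumptions not shown in the log.

```python
# Sketch reconstructing the architecture implied by the printed summary.
# Assumptions (not in the log): dropout rate 0.5, ReLU/softmax activations,
# weights=None (random initialization).
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import InceptionV3

base = InceptionV3(include_top=False, weights=None, input_shape=(224, 224, 3))
base.trainable = False  # matches the 8,975,264 non-trainable parameters

# Truncate at the 'mixed7' concatenation, as in the summary.
trunk = Model(base.input, base.get_layer("mixed7").output)

x = layers.Flatten()(trunk.output)                  # (None, 12*12*768) = (None, 110592)
x = layers.Dense(512, activation="relu")(x)         # 110592*512 + 512 = 56,623,616
x = layers.Dropout(0.5)(x)                          # rate 0.5 is an assumption
outputs = layers.Dense(5, activation="softmax")(x)  # 512*5 + 5 = 2,565
model = Model(trunk.input, outputs)
```

Note that with a 224×224 input (rather than InceptionV3's default 299×299), `mixed7` comes out at 12×12×768, which is what makes the flattened feature vector 110,592 wide.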

Epoch 1/100
113/113 [==============================] - 200s 2s/step - loss: 3.0354 - accuracy: 0.2540 - val_loss: 1.4802 - val_accuracy: 0.2306
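Each epoch runs 113 steps. With the 3586 training images reported earlier, that is consistent with a batch size of 32 (the batch size itself is an assumption; it is not printed in the log):

```python
import math

train_images = 3586  # from the "Found 3586 validated image filenames" line
batch_size = 32      # assumed; not shown in the log
steps_per_epoch = math.ceil(train_images / batch_size)
print(steps_per_epoch)  # 113
```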

Epoch 2/100
113/113 [==============================] - 123s 1s/step - loss: 1.5015 - accuracy: 0.2747 - val_loss: 1.3900 - val_accuracy: 0.2957
Epoch 3/100
113/113 [==============================] - 126s 1s/step - loss: 1.4537 - accuracy: 0.2817 - val_loss: 1.4152 - val_accuracy: 0.3033
Epoch 4/100
113/113 [==============================] - 126s 1s/step - loss: 1.4344 - accuracy: 0.2998 - val_loss: 1.3710 - val_accuracy: 0.2857
Epoch 5/100
113/113 [==============================] - 126s 1s/step - loss: 1.4224 - accuracy: 0.3140 - val_loss: 1.3499 - val_accuracy: 0.3684
Epoch 6/100
113/113 [==============================] - 127s 1s/step - loss: 1.4184 - accuracy: 0.3212 - val_loss: 1.3448 - val_accuracy: 0.3659
Epoch 7/100
113/113 [==============================] - 128s 1s/step - loss: 1.3935 - accuracy: 0.3316 - val_loss: 1.3290 - val_accuracy: 0.3709
Epoch 8/100
113/113 [==============================] - 127s 1s/step - loss: 1.4002 - accuracy: 0.3383 - val_loss: 1.3310 - val_accuracy: 0.4035
Epoch 9/100
113/113 [==============================] - 128s 1s/step - loss: 1.3902 - accuracy: 0.3313 - val_loss: 1.3047 - val_accuracy: 0.4110
Epoch 10/100
113/113 [==============================] - 127s 1s/step - loss: 1.3976 - accuracy: 0.3424 - val_loss: 1.3310 - val_accuracy: 0.3108
Epoch 11/100
113/113 [==============================] - 127s 1s/step - loss: 1.3958 - accuracy: 0.3277 - val_loss: 1.3152 - val_accuracy: 0.4110
Epoch 12/100
113/113 [==============================] - 127s 1s/step - loss: 1.4147 - accuracy: 0.3193 - val_loss: 1.3127 - val_accuracy: 0.3985
Epoch 13/100
113/113 [==============================] - 127s 1s/step - loss: 1.4018 - accuracy: 0.3263 - val_loss: 1.3286 - val_accuracy: 0.3634
Epoch 14/100
113/113 [==============================] - 127s 1s/step - loss: 1.3962 - accuracy: 0.3246 - val_loss: 1.3129 - val_accuracy: 0.3784
Epoch 15/100
113/113 [==============================] - 128s 1s/step - loss: 1.4071 - accuracy: 0.3352 - val_loss: 1.3213 - val_accuracy: 0.3684
Epoch 16/100
113/113 [==============================] - 128s 1s/step - loss: 1.3840 - accuracy: 0.3397 - val_loss: 1.3044 - val_accuracy: 0.3885
Epoch 17/100
113/113 [==============================] - 128s 1s/step - loss: 1.3912 - accuracy: 0.3388 - val_loss: 1.3166 - val_accuracy: 0.4035
Epoch 18/100
113/113 [==============================] - 128s 1s/step - loss: 1.3994 - accuracy: 0.3313 - val_loss: 1.3109 - val_accuracy: 0.3609
Epoch 19/100
113/113 [==============================] - 129s 1s/step - loss: 1.3888 - accuracy: 0.3436 - val_loss: 1.3035 - val_accuracy: 0.4160
Epoch 20/100
113/113 [==============================] - 129s 1s/step - loss: 1.3853 - accuracy: 0.3243 - val_loss: 1.2910 - val_accuracy: 0.3985
Epoch 21/100
113/113 [==============================] - 122s 1s/step - loss: 1.3821 - accuracy: 0.3363 - val_loss: 1.3027 - val_accuracy: 0.3960
Epoch 22/100
113/113 [==============================] - 115s 1s/step - loss: 1.3899 - accuracy: 0.3366 - val_loss: 1.3000 - val_accuracy: 0.3985
Epoch 23/100
113/113 [==============================] - 116s 1s/step - loss: 1.3818 - accuracy: 0.3257 - val_loss: 1.2984 - val_accuracy: 0.4160
Epoch 24/100
113/113 [==============================] - 116s 1s/step - loss: 1.4052 - accuracy: 0.3210 - val_loss: 1.3353 - val_accuracy: 0.3759
Epoch 25/100
113/113 [==============================] - 116s 1s/step - loss: 1.4230 - accuracy: 0.3162 - val_loss: 1.3050 - val_accuracy: 0.3935
Epoch 26/100
113/113 [==============================] - 117s 1s/step - loss: 1.3986 - accuracy: 0.3201 - val_loss: 1.3172 - val_accuracy: 0.3634
Epoch 27/100
113/113 [==============================] - 116s 1s/step - loss: 1.3827 - accuracy: 0.3366 - val_loss: 1.2985 - val_accuracy: 0.3810
Epoch 28/100
113/113 [==============================] - 116s 1s/step - loss: 1.4023 - accuracy: 0.3305 - val_loss: 1.2961 - val_accuracy: 0.3960
Epoch 29/100
113/113 [==============================] - 116s 1s/step - loss: 1.3950 - accuracy: 0.3265 - val_loss: 1.3011 - val_accuracy: 0.3985
Epoch 30/100
113/113 [==============================] - 116s 1s/step - loss: 1.3825 - accuracy: 0.3232 - val_loss: 1.3009 - val_accuracy: 0.4085
Epoch 31/100
113/113 [==============================] - 116s 1s/step - loss: 1.3725 - accuracy: 0.3374 - val_loss: 1.2946 - val_accuracy: 0.4110
Epoch 32/100
113/113 [==============================] - 116s 1s/step - loss: 1.3861 - accuracy: 0.3243 - val_loss: 1.2844 - val_accuracy: 0.4160
Epoch 33/100
113/113 [==============================] - 117s 1s/step - loss: 1.3720 - accuracy: 0.3405 - val_loss: 1.2902 - val_accuracy: 0.4035
Epoch 34/100
113/113 [==============================] - 116s 1s/step - loss: 1.3754 - accuracy: 0.3257 - val_loss: 1.2969 - val_accuracy: 0.3609
Epoch 35/100
113/113 [==============================] - 116s 1s/step - loss: 1.3799 - accuracy: 0.3274 - val_loss: 1.3148 - val_accuracy: 0.3434
Epoch 36/100
113/113 [==============================] - 116s 1s/step - loss: 1.3868 - accuracy: 0.3436 - val_loss: 1.3460 - val_accuracy: 0.3058
Epoch 37/100
113/113 [==============================] - 116s 1s/step - loss: 1.3700 - accuracy: 0.3341 - val_loss: 1.3304 - val_accuracy: 0.3734
Epoch 38/100
113/113 [==============================] - 117s 1s/step - loss: 1.3893 - accuracy: 0.3405 - val_loss: 1.2819 - val_accuracy: 0.4035
Epoch 39/100
113/113 [==============================] - 116s 1s/step - loss: 1.3852 - accuracy: 0.3402 - val_loss: 1.2907 - val_accuracy: 0.3985
Epoch 40/100
113/113 [==============================] - 117s 1s/step - loss: 1.3752 - accuracy: 0.3410 - val_loss: 1.2887 - val_accuracy: 0.3860
Epoch 41/100
113/113 [==============================] - 117s 1s/step - loss: 1.3746 - accuracy: 0.3299 - val_loss: 1.2920 - val_accuracy: 0.3634
Epoch 42/100
113/113 [==============================] - 117s 1s/step - loss: 1.3749 - accuracy: 0.3377 - val_loss: 1.2950 - val_accuracy: 0.3308
Epoch 43/100
113/113 [==============================] - 117s 1s/step - loss: 1.3805 - accuracy: 0.3374 - val_loss: 1.3230 - val_accuracy: 0.3734
Epoch 44/100
113/113 [==============================] - 117s 1s/step - loss: 1.3706 - accuracy: 0.3391 - val_loss: 1.2968 - val_accuracy: 0.3734
Epoch 45/100
113/113 [==============================] - 116s 1s/step - loss: 1.3860 - accuracy: 0.3260 - val_loss: 1.2953 - val_accuracy: 0.3810
Epoch 46/100
113/113 [==============================] - 116s 1s/step - loss: 1.3686 - accuracy: 0.3355 - val_loss: 1.2823 - val_accuracy: 0.4336
Epoch 47/100
113/113 [==============================] - 117s 1s/step - loss: 1.3679 - accuracy: 0.3603 - val_loss: 1.3111 - val_accuracy: 0.3609
Epoch 48/100
113/113 [==============================] - 117s 1s/step - loss: 1.3738 - accuracy: 0.3542 - val_loss: 1.2836 - val_accuracy: 0.4436
Epoch 49/100
113/113 [==============================] - 116s 1s/step - loss: 1.3710 - accuracy: 0.3416 - val_loss: 1.3718 - val_accuracy: 0.3985
Epoch 50/100
113/113 [==============================] - 116s 1s/step - loss: 1.3653 - accuracy: 0.3405 - val_loss: 1.2825 - val_accuracy: 0.4035
Epoch 51/100
113/113 [==============================] - 117s 1s/step - loss: 1.3711 - accuracy: 0.3480 - val_loss: 1.2958 - val_accuracy: 0.4085
Epoch 52/100
113/113 [==============================] - 116s 1s/step - loss: 1.3643 - accuracy: 0.3505 - val_loss: 1.2818 - val_accuracy: 0.4035
Epoch 53/100
113/113 [==============================] - 116s 1s/step - loss: 1.3660 - accuracy: 0.3581 - val_loss: 1.2929 - val_accuracy: 0.3860
Epoch 54/100
113/113 [==============================] - 116s 1s/step - loss: 1.3806 - accuracy: 0.3399 - val_loss: 1.2759 - val_accuracy: 0.4336
Epoch 55/100
113/113 [==============================] - 116s 1s/step - loss: 1.3750 - accuracy: 0.3369 - val_loss: 1.2948 - val_accuracy: 0.4135
Epoch 56/100
113/113 [==============================] - 116s 1s/step - loss: 1.3774 - accuracy: 0.3366 - val_loss: 1.2872 - val_accuracy: 0.4010
Epoch 57/100
113/113 [==============================] - 116s 1s/step - loss: 1.3632 - accuracy: 0.3430 - val_loss: 1.2755 - val_accuracy: 0.3985
Epoch 58/100
113/113 [==============================] - 116s 1s/step - loss: 1.3695 - accuracy: 0.3555 - val_loss: 1.2766 - val_accuracy: 0.4211
Epoch 59/100
113/113 [==============================] - 116s 1s/step - loss: 1.3645 - accuracy: 0.3542 - val_loss: 1.2826 - val_accuracy: 0.4211
Epoch 60/100
113/113 [==============================] - 116s 1s/step - loss: 1.3646 - accuracy: 0.3410 - val_loss: 1.2729 - val_accuracy: 0.3935
Epoch 61/100
113/113 [==============================] - 117s 1s/step - loss: 1.3704 - accuracy: 0.3475 - val_loss: 1.2938 - val_accuracy: 0.3684
Epoch 62/100
113/113 [==============================] - 116s 1s/step - loss: 1.3549 - accuracy: 0.3597 - val_loss: 1.3774 - val_accuracy: 0.3784
Epoch 63/100
113/113 [==============================] - 116s 1s/step - loss: 1.3778 - accuracy: 0.3447 - val_loss: 1.2788 - val_accuracy: 0.3885
Epoch 64/100
113/113 [==============================] - 117s 1s/step - loss: 1.3652 - accuracy: 0.3424 - val_loss: 1.2787 - val_accuracy: 0.4060
Epoch 65/100
113/113 [==============================] - 116s 1s/step - loss: 1.3745 - accuracy: 0.3542 - val_loss: 1.2709 - val_accuracy: 0.4236
Epoch 66/100
113/113 [==============================] - 116s 1s/step - loss: 1.3790 - accuracy: 0.3486 - val_loss: 1.2772 - val_accuracy: 0.3935
Epoch 67/100
113/113 [==============================] - 116s 1s/step - loss: 1.3712 - accuracy: 0.3519 - val_loss: 1.2937 - val_accuracy: 0.3734
Epoch 68/100
113/113 [==============================] - 117s 1s/step - loss: 1.3675 - accuracy: 0.3461 - val_loss: 1.2783 - val_accuracy: 0.4135
Epoch 69/100
113/113 [==============================] - 117s 1s/step - loss: 1.3749 - accuracy: 0.3452 - val_loss: 1.2760 - val_accuracy: 0.4035
Epoch 70/100
113/113 [==============================] - 117s 1s/step - loss: 1.3731 - accuracy: 0.3497 - val_loss: 1.2795 - val_accuracy: 0.3709
Epoch 71/100
113/113 [==============================] - 116s 1s/step - loss: 1.3676 - accuracy: 0.3477 - val_loss: 1.3222 - val_accuracy: 0.3434
Epoch 72/100
113/113 [==============================] - 117s 1s/step - loss: 1.3711 - accuracy: 0.3508 - val_loss: 1.2876 - val_accuracy: 0.3784
Epoch 73/100
113/113 [==============================] - 117s 1s/step - loss: 1.3705 - accuracy: 0.3327 - val_loss: 1.2771 - val_accuracy: 0.3960
Epoch 74/100
113/113 [==============================] - 117s 1s/step - loss: 1.3644 - accuracy: 0.3377 - val_loss: 1.2798 - val_accuracy: 0.4110
Epoch 75/100
113/113 [==============================] - 116s 1s/step - loss: 1.3645 - accuracy: 0.3413 - val_loss: 1.2681 - val_accuracy: 0.4211
Epoch 76/100
113/113 [==============================] - 116s 1s/step - loss: 1.3660 - accuracy: 0.3503 - val_loss: 1.2965 - val_accuracy: 0.3584
Epoch 77/100
113/113 [==============================] - 116s 1s/step - loss: 1.3739 - accuracy: 0.3558 - val_loss: 1.2717 - val_accuracy: 0.4311
Epoch 78/100
113/113 [==============================] - 116s 1s/step - loss: 1.3611 - accuracy: 0.3469 - val_loss: 1.2766 - val_accuracy: 0.3810
Epoch 79/100
113/113 [==============================] - 117s 1s/step - loss: 1.3654 - accuracy: 0.3397 - val_loss: 1.2735 - val_accuracy: 0.4110
Epoch 80/100
113/113 [==============================] - 117s 1s/step - loss: 1.3719 - accuracy: 0.3419 - val_loss: 1.2864 - val_accuracy: 0.3358
Epoch 81/100
113/113 [==============================] - 117s 1s/step - loss: 1.3762 - accuracy: 0.3441 - val_loss: 1.2815 - val_accuracy: 0.3860
Epoch 82/100
113/113 [==============================] - 116s 1s/step - loss: 1.3564 - accuracy: 0.3410 - val_loss: 1.3021 - val_accuracy: 0.3659
Epoch 83/100
113/113 [==============================] - 117s 1s/step - loss: 1.3704 - accuracy: 0.3349 - val_loss: 1.2810 - val_accuracy: 0.3910
Epoch 84/100
113/113 [==============================] - 117s 1s/step - loss: 1.3691 - accuracy: 0.3391 - val_loss: 1.2844 - val_accuracy: 0.3885
Epoch 85/100
113/113 [==============================] - 117s 1s/step - loss: 1.3575 - accuracy: 0.3405 -


val_loss: 1.2818 - val_accuracy: 0.3810

Epoch 86/100

113/113 [==============================] - 116s 1s/step - loss: 1.3737 - accuracy: 0.3516 -


val_loss: 1.2908 - val_accuracy: 0.4436

Epoch 87/100

113/113 [==============================] - 117s 1s/step - loss: 1.3677 - accuracy: 0.3472 -


val_loss: 1.2799 - val_accuracy: 0.4411

Epoch 88/100

113/113 [==============================] - 116s 1s/step - loss: 1.3667 - accuracy: 0.3433 -


val_loss: 1.2828 - val_accuracy: 0.4160

Epoch 89/100

113/113 [==============================] - 116s 1s/step - loss: 1.3564 - accuracy: 0.3572 -


val_loss: 1.2759 - val_accuracy: 0.4060

Epoch 90/100

113/113 [==============================] - 116s 1s/step - loss: 1.3633 - accuracy: 0.3508 -


val_loss: 1.2762 - val_accuracy: 0.4110
Epoch 91/100

113/113 [==============================] - 119s 1s/step - loss: 1.3598 - accuracy: 0.3547 -


val_loss: 1.2607 - val_accuracy: 0.4035

Epoch 92/100

113/113 [==============================] - 116s 1s/step - loss: 1.3601 - accuracy: 0.3555 -


val_loss: 1.2747 - val_accuracy: 0.4060

Epoch 93/100

113/113 [==============================] - 116s 1s/step - loss: 1.3684 - accuracy: 0.3416 -


val_loss: 1.2798 - val_accuracy: 0.3784

Epoch 94/100

113/113 [==============================] - 117s 1s/step - loss: 1.3659 - accuracy: 0.3475 -


val_loss: 1.3087 - val_accuracy: 0.3559

Epoch 95/100

113/113 [==============================] - 117s 1s/step - loss: 1.3720 - accuracy: 0.3500 -


val_loss: 1.2728 - val_accuracy: 0.4110

Epoch 96/100

113/113 [==============================] - 116s 1s/step - loss: 1.3649 - accuracy: 0.3558 -


val_loss: 1.2890 - val_accuracy: 0.3709

Epoch 97/100

113/113 [==============================] - 116s 1s/step - loss: 1.3784 - accuracy: 0.3338 -


val_loss: 1.2952 - val_accuracy: 0.3609

Epoch 98/100

113/113 [==============================] - 117s 1s/step - loss: 1.3640 - accuracy: 0.3578 -


val_loss: 1.2746 - val_accuracy: 0.4185

Epoch 99/100

113/113 [==============================] - 117s 1s/step - loss: 1.3621 - accuracy: 0.3689 -


val_loss: 1.2616 - val_accuracy: 0.4586

Epoch 100/100

113/113 [==============================] - 116s 1s/step - loss: 1.3613 - accuracy: 0.3463 -


val_loss: 1.2708 - val_accuracy: 0.4135

52/52 [==============================] - 48s 923ms/step - loss: 1.2283 - accuracy: 0.3853

Test Accuracy: 38.53%

52/52 [==============================] - 26s 484ms/step
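The second 52/52 pass above is a prediction run over the test generator; its per-class softmax scores are reduced to class labels before they can be compared with the true classes. A minimal sketch of that reduction (the `probs` array here is a hypothetical stand-in for the output of `model.predict`, which is not reproduced in this log):

```python
import numpy as np

# Hypothetical stand-in for model.predict(test_generator): an (N, 5)
# array of softmax scores, one row per test image, one column per class.
probs = np.array([[0.10, 0.60, 0.10, 0.10, 0.10],
                  [0.70, 0.10, 0.10, 0.05, 0.05],
                  [0.05, 0.05, 0.05, 0.05, 0.80]])

# argmax over the class axis picks the highest-scoring class per image,
# yielding the integer labels used in the classification report.
y_pred = np.argmax(probs, axis=1)
print(y_pred)  # → [1 0 4]
```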


C:\Users\benin\anaconda3\envs\ultralytics\lib\site-packages\sklearn\metrics\_classification.py:1318: UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
  _warn_prf(average, modifier, msg_start, len(result))

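The UndefinedMetricWarning fires because at least one class never appears among the predictions, so its precision is undefined. A minimal sketch of silencing it with the `zero_division` parameter that the warning itself suggests (toy labels, not the actual test set; here class 2 is deliberately never predicted):

```python
from sklearn.metrics import classification_report

# Toy labels in which class 2 receives no predicted samples,
# mirroring the situation that triggered the warning above.
y_true = [0, 1, 2, 2, 0, 1]
y_pred = [0, 1, 0, 1, 0, 1]

# zero_division=0 pins the undefined precision/F-score for class 2 to 0.0
# explicitly and suppresses the UndefinedMetricWarning.
report = classification_report(y_true, y_pred, zero_division=0)
print(report)
```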

Classification Report:

              precision    recall  f1-score   support

           0       0.76      0.29      0.42       639
           1       0.23      0.66      0.34       296
           2       0.46      0.32      0.38       447
           3       0.46      0.51      0.48       223
           4       0.00      0.00      0.00        51

    accuracy                           0.39      1656
   macro avg       0.38      0.36      0.32      1656
weighted avg       0.52      0.39      0.39      1656
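The supports (639/296/447/223/51) show a heavily imbalanced test set, which is consistent with class 4 scoring 0.00 across the board. One common mitigation, not part of the original script, is to pass balanced class weights to `model.fit`; a sketch using scikit-learn's `compute_class_weight` with the support counts from the report above standing in for the training labels:

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# Class counts taken from the report's support column (assumption:
# the training set is imbalanced in roughly the same proportions).
counts = {0: 639, 1: 296, 2: 447, 3: 223, 4: 51}
labels = np.concatenate([np.full(n, c) for c, n in counts.items()])

# "balanced" weights each class by n_samples / (n_classes * count),
# so the rarest class (4) gets the largest weight.
weights = compute_class_weight(class_weight="balanced",
                               classes=np.arange(5), y=labels)
class_weight = dict(enumerate(weights))
print(class_weight)
# In Keras this dict would be passed as model.fit(..., class_weight=class_weight).
```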
