La Praktikum m3

1) The document walks through building and training a generative adversarial network (GAN) with Keras and TensorFlow to generate images of handwritten digits from the MNIST dataset.
2) It imports the necessary packages, defines variables for the network architecture and data, and builds the generator and discriminator as Sequential models using Dense, LeakyReLU, BatchNormalization, Flatten, and Reshape layers.
3) The generator and discriminator are then connected and compiled into a combined GAN model, which is trained on batches of real and generated images for many epochs; the discriminator and generator losses are printed at each step, and grids of generated images are saved periodically during training.

1) Importing Python Packages for GAN

In [6]:

from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import BatchNormalization
from keras.layers import Dense, Reshape, Flatten
from keras.layers import LeakyReLU
from tensorflow.keras.optimizers import Adam

import numpy as np

# Notebook shell escape: create the directory where generated image grids are saved
!mkdir generated_images
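Note that `!mkdir generated_images` is a notebook shell escape and errors if the directory already exists; a portable plain-Python equivalent (an alternative sketch, not part of the original notebook) is:

import os
os.makedirs("generated_images", exist_ok=True)  # no error if the directory already exists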

2) Variables for Neural Networks & Data


In [7]:

img_width = 28
img_height = 28
channels = 1
img_shape = (img_width, img_height, channels)
latent_dim = 100
adam = Adam(learning_rate=0.0001)  # `lr` is deprecated in newer Keras versions; use `learning_rate`
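This single `adam` instance is reused when compiling both the discriminator and the combined GAN below, which works in this notebook. A common defensive variant (an assumption, not something this notebook requires) is to give each model its own optimizer instance so Adam's per-variable state and step counter are not shared:

# Optional defensive variant; the notebook itself reuses the single `adam` instance.
d_optimizer = Adam(learning_rate=0.0001)  # would be passed to discriminator.compile
g_optimizer = Adam(learning_rate=0.0001)  # would be passed to GAN.compile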

3) Building Generator
In [8]:

def build_generator():
    model = Sequential()

    # Project the latent vector up through three Dense blocks
    model.add(Dense(256, input_dim=latent_dim))
    model.add(LeakyReLU(alpha=0.2))
    model.add(BatchNormalization(momentum=0.8))

    model.add(Dense(256))
    model.add(LeakyReLU(alpha=0.2))
    model.add(BatchNormalization(momentum=0.8))

    model.add(Dense(256))
    model.add(LeakyReLU(alpha=0.2))
    model.add(BatchNormalization(momentum=0.8))

    # tanh output matches the [-1, 1] scaling applied to the training data
    model.add(Dense(np.prod(img_shape), activation='tanh'))
    model.add(Reshape(img_shape))

    model.summary()
    return model

generator = build_generator()

Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 256) 25856

leaky_re_lu (LeakyReLU) (None, 256) 0


batch_normalization (BatchN (None, 256) 1024
ormalization)

dense_1 (Dense) (None, 256) 65792

leaky_re_lu_1 (LeakyReLU) (None, 256) 0

batch_normalization_1 (Batc (None, 256) 1024


hNormalization)

dense_2 (Dense) (None, 256) 65792

leaky_re_lu_2 (LeakyReLU) (None, 256) 0

batch_normalization_2 (Batc (None, 256) 1024


hNormalization)

dense_3 (Dense) (None, 784) 201488

reshape (Reshape) (None, 28, 28, 1) 0

=================================================================
Total params: 362,000
Trainable params: 360,464
Non-trainable params: 1,536
_________________________________________________________________
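A quick sanity check (a minimal sketch, not part of the original notebook): push one random latent vector through the untrained generator and confirm the output shape and tanh value range:

noise = np.random.normal(0, 1, (1, latent_dim))
sample = generator.predict(noise)
print(sample.shape)                # (1, 28, 28, 1), matching img_shape
print(sample.min(), sample.max())  # tanh keeps values inside [-1, 1]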

4) Building Discriminator
In [9]:
def build_discriminator():
    model = Sequential()

    # Flatten the 28x28x1 image and score it with a small MLP
    model.add(Flatten(input_shape=img_shape))
    model.add(Dense(512))
    model.add(LeakyReLU(alpha=0.2))
    model.add(Dense(256))
    model.add(Dense(1, activation='sigmoid'))  # probability that the input is real

    model.summary()
    return model

discriminator = build_discriminator()
discriminator.compile(loss='binary_crossentropy', optimizer=adam, metrics=['accuracy'])

Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
flatten (Flatten) (None, 784) 0

dense_4 (Dense) (None, 512) 401920

leaky_re_lu_3 (LeakyReLU) (None, 512) 0

dense_5 (Dense) (None, 256) 131328

dense_6 (Dense) (None, 1) 257

=================================================================
Total params: 533,505
Trainable params: 533,505
Non-trainable params: 0
_________________________________________________________________
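A similar hedged check (again, a sketch rather than part of the original notebook): score one real digit and one generated image. Before training, the sigmoid outputs should hover near 0.5:

(x_check, _), (_, _) = mnist.load_data()
real = x_check[:1].reshape(1, 28, 28, 1) / 127.5 - 1.0  # same [-1, 1] scaling used in training
fake = generator.predict(np.random.normal(0, 1, (1, latent_dim)))
print(discriminator.predict(real), discriminator.predict(fake))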

5) Connecting Neural Networks to build GAN


In [10]:
GAN = Sequential()
discriminator.trainable = False  # freeze D inside the combined model; the standalone
                                 # discriminator was compiled above, so it still trains
GAN.add(generator)
GAN.add(discriminator)

GAN.compile(loss='binary_crossentropy', optimizer=adam)
GAN.summary()

Model: "sequential_2"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
sequential (Sequential) (None, 28, 28, 1) 362000

sequential_1 (Sequential) (None, 1) 533505

=================================================================
Total params: 895,505
Trainable params: 360,464
Non-trainable params: 535,041
_________________________________________________________________
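Because discriminator.trainable was set to False before GAN.compile, only the generator's weights receive gradients through the combined model, which matches the 360,464 trainable parameters in the summary above. A minimal check (a sketch, not in the original notebook):

trainable = int(sum(np.prod(w.shape.as_list()) for w in GAN.trainable_weights))
print(trainable)  # 360464 -- the generator's trainable parameters only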

6) Outputting Images
In [11]:

import matplotlib.pyplot as plt
import glob
import imageio
import PIL

# Counter used to build strictly increasing file names for the saved image grids
save_name = 0.00000000

def save_imgs(epoch):
    r, c = 5, 5
    noise = np.random.normal(0, 1, (r * c, latent_dim))
    gen_imgs = generator.predict(noise)
    global save_name
    save_name += 0.00000001
    print("%.8f" % save_name)

    # Rescale images from [-1, 1] back to [0, 1] for display
    gen_imgs = 0.5 * gen_imgs + 0.5

    fig, axs = plt.subplots(r, c)
    cnt = 0
    for i in range(r):
        for j in range(c):
            axs[i, j].imshow(gen_imgs[cnt, :, :, 0], cmap='gray')
            # axs[i, j].imshow(gen_imgs[cnt])
            axs[i, j].axis('off')
            cnt += 1
    fig.savefig("generated_images/%.8f.png" % save_name)
    print('saved')
    plt.close()
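As a usage example (optional; the training loop below calls save_imgs on its own schedule), a baseline grid from the untrained generator can be written out once:

save_imgs(0)  # writes generated_images/0.00000001.png before any training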

7) Training GAN
In [12]:
def train(epochs, batch_size=64, save_interval=200):
    (X_train, _), (_, _) = mnist.load_data()

    # Rescale pixel values to [-1, 1] to match the generator's tanh output
    X_train = X_train / 127.5 - 1.
    # Add the channel axis so real images match the (28, 28, 1) input shape
    X_train = np.expand_dims(X_train, axis=3)

    # Create our y labels for the discriminator: 1 = real, 0 = fake
    valid = np.ones((batch_size, 1))
    fakes = np.zeros((batch_size, 1))

    for epoch in range(epochs):
        # Get a random batch of real images
        idx = np.random.randint(0, X_train.shape[0], batch_size)
        imgs = X_train[idx]

        # Generate fake images
        noise = np.random.normal(0, 1, (batch_size, latent_dim))
        gen_imgs = generator.predict(noise)

        # Train the discriminator on real and fake batches separately
        d_loss_real = discriminator.train_on_batch(imgs, valid)
        d_loss_fake = discriminator.train_on_batch(gen_imgs, fakes)
        d_loss = 0.5 * np.add(d_loss_real, d_loss_fake)

        # Train the generator through the frozen discriminator,
        # using the inverse y label (fakes presented as valid)
        noise = np.random.normal(0, 1, (batch_size, latent_dim))
        g_loss = GAN.train_on_batch(noise, valid)

        print("******* %d [D loss: %f , acc: %.2f%%] [G loss: %f ]"
              % (epoch, d_loss[0], 100 * d_loss[1], g_loss))

        if (epoch % save_interval) == 0:
            save_imgs(epoch)
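In terms of this loop, the two discriminator train_on_batch calls and the single GAN.train_on_batch call implement the standard non-saturating GAN losses, where x is a real batch and z ~ N(0, I) is latent noise:

L_D = -(1/2) E_x[ log D(x) ] - (1/2) E_z[ log(1 - D(G(z))) ]
L_G = - E_z[ log D(G(z)) ]

Presenting fakes with `valid` labels maximizes log D(G(z)) rather than minimizing log(1 - D(G(z))), which gives the generator stronger gradients early in training.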

train(30000, batch_size=64, save_interval=200)

Streaming output truncated to the last 5000 lines.

2/2 [==============================] - 0s 5ms/step
******* 27518 [D loss: 0.484928, acc: 77.34%] [G loss: 1.363376]
2/2 [==============================] - 0s 5ms/step
******* 27519 [D loss: 0.600889, acc: 71.09%] [G loss: 1.262974]
2/2 [==============================] - 0s 6ms/step
******* 27520 [D loss: 0.628647, acc: 65.62%] [G loss: 1.172042]

[... per-step output continues; D loss stays near 0.6 (roughly 54-78% accuracy) and G loss near 1.2 through the end of training ...]

2/2 [==============================] - 0s 5ms/step
******* 29997 [D loss: 0.610135, acc: 65.62%] [G loss: 1.113297]
2/2 [==============================] - 0s 5ms/step
******* 29998 [D loss: 0.562490, acc: 76.56%] [G loss: 1.191264]
2/2 [==============================] - 0s 5ms/step
******* 29999 [D loss: 0.607785, acc: 65.62%] [G loss: 1.229325]

8) Making GIF

In [13]:
# Display a single image using the epoch number
# def display_image(epoch_no):
#     return PIL.Image.open('generated_images/%.8f.png' % epoch_no)

anim_file = 'dcgan.gif'

# Stitch the saved grids into an animated GIF, in filename (= training) order
with imageio.get_writer(anim_file, mode='I') as writer:
    filenames = sorted(glob.glob('generated_images/*.png'))
    for filename in filenames:
        image = imageio.imread(filename)
        writer.append_data(image)
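To preview the result inline (a minimal sketch, assuming a Jupyter/Colab session):

from IPython.display import Image
Image(filename=anim_file)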

