
MACHINE VISION HOMEWORK

DECEMBER 2021
TEWODROS BZUAYEHU ASMAMAW

This homework fits a curve to a noisy sinusoidal signal using a neural network built in TensorFlow. Noisy samples of the signal are fed to a small network of dense linear and ReLU layers, the network is trained for 100 epochs, and plots of the training progress and of the predicted versus actual outputs are generated and saved.

Curve fitting a sinusoidal signal

import numpy as np
import matplotlib.pyplot as plt
from tensorflow import keras
from google.colab import files
import tensorflow as tf
import math

print('TensorFlow version: ' + tf.__version__)


# Then we create the training data: x_data consists of 1000 points, and
# normal noise is added to the y coordinate of each point.

# Create noisy data


x_data = np.linspace(-10, 10, num=1000)
y_data = 0.1*x_data*np.cos(x_data) + 0.1*np.random.normal(size=1000)
print('Data created successfully')
# Display the dataset
plt.scatter(x_data, y_data, s=2)
plt.grid()
plt.savefig('dataset.png', dpi=300)  # save before show(): show() clears the figure
plt.show()
files.download('dataset.png')
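For reference, the noiseless ground truth underlying the noisy samples is y = 0.1 * x * cos(x). A minimal sketch (an optional addition, not part of the original assignment) that overlays it on the scatter plot:

# Overlay the noiseless ground truth y = 0.1*x*cos(x) on the noisy data
y_true = 0.1 * x_data * np.cos(x_data)
plt.scatter(x_data, y_data, s=2, label='noisy data')
plt.plot(x_data, y_true, 'g', linewidth=2, label='ground truth')
plt.legend()
plt.grid()
plt.show()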

# Create the model


model = keras.Sequential()
model.add(keras.layers.Dense(units=1, activation='linear', input_shape=[1]))
model.add(keras.layers.Dense(units=64, activation='relu'))
model.add(keras.layers.Dense(units=64, activation='relu'))
model.add(keras.layers.Dense(units=1, activation='linear'))
model.compile(loss='mse', optimizer="adam")
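The compiled 'mse' loss is the mean squared error over the N training points,

    loss = (1/N) * sum_i (y_i - y_hat_i)^2

which the Adam optimizer minimizes by stochastic gradient descent.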
# Display the model
model.summary()
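As a sanity check on the model.summary() output: each Dense layer holds inputs x units weights plus units biases, so the expected parameter counts are:

# Expected parameter counts per layer (inputs*units + units):
#   Dense(1,  input 1):  1*1   + 1  = 2
#   Dense(64, input 1):  1*64  + 64 = 128
#   Dense(64, input 64): 64*64 + 64 = 4160
#   Dense(1,  input 64): 64*1  + 1  = 65
#   Total trainable parameters: 2 + 128 + 4160 + 65 = 4355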
# x_data is the input
# y_data is the expected output
# epochs=100 means the network will be trained on the whole dataset 100 times
# verbose=1 displays progress and loss in the console

# Training
model.fit(x_data, y_data, epochs=100, verbose=1)
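fit() also returns a Keras History object whose history['loss'] list records the loss at each epoch. A minimal sketch (an optional addition; note that calling fit() again continues training the already-fitted model) that plots the training curve:

# Continue training while recording the per-epoch loss
history = model.fit(x_data, y_data, epochs=100, verbose=0)
plt.plot(history.history['loss'])
plt.xlabel('epoch')
plt.ylabel('mse loss')
plt.grid()
plt.show()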
# Compute the output
y_predicted = model.predict(x_data)

# Display the result


plt.scatter(x_data, y_data, s=1)
plt.plot(x_data, y_predicted, 'r', linewidth=4)
plt.grid()
#plt.show()
plt.savefig('training.png', dpi=300)
files.download("training.png")
for x in range(100):
    # One additional epoch of training
    model.fit(x_data, y_data, epochs=1, verbose=1)
    # Then we can predict the output after this epoch
    y_predicted = model.predict(x_data)
    # Display and save the result for this epoch
    plt.scatter(x_data, y_data, s=1)
    plt.plot(x_data, y_predicted, 'r', linewidth=4)
    plt.grid()
    plt.ylim(top=1.2)      # adjust the top, leaving the bottom unchanged
    plt.ylim(bottom=-1.2)
    plt.savefig('training-' + str(x) + '-epochs.png', dpi=300)
    files.download('training-' + str(x) + '-epochs.png')
    plt.clf()

Conclusion
Once the training dataset is built, we create the network:

- The first layer is a single linear unit (the input layer)
- The second layer is a 64-unit ReLU layer
- The third layer is another 64-unit ReLU layer
- The last layer is a single linear unit (the output layer)
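As a final quantitative check (an optional addition, not part of the original homework), Keras's evaluate() returns the compiled loss, here the mean squared error, on the data it is given:

# Report the final mean squared error on the (noisy) data;
# with a single compiled loss and no extra metrics, evaluate() returns a scalar
final_mse = model.evaluate(x_data, y_data, verbose=0)
print('Final MSE:', final_mse)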
