Implementation: Multi-Class Backpropagation #486

Merged · 5 commits · Apr 12, 2017
Changes from 1 commit
Update test_learning.py
antmarakis authored Apr 7, 2017
commit c04212342ff267610ecc820bafb10df8b2c78ca9
32 changes: 21 additions & 11 deletions tests/test_learning.py
@@ -52,23 +52,33 @@ def test_decision_tree_learner():
 
 def test_neural_network_learner():
     iris = DataSet(name="iris")
-    iris.remove_examples("virginica")
 
     classes = ["setosa","versicolor","virginica"]
-    iris.classes_to_numbers()
+    iris.classes_to_numbers(classes)
 
-    nNL = NeuralNetLearner(iris)
-    # NeuralNetLearner might be wrong. Just check if prediction is in range.
-    assert nNL([5,3,1,0.1]) in range(len(classes))
+    nNL = NeuralNetLearner(iris, [5], 0.15, 75)
+    pred1 = nNL([5,3,1,0.1])
+    pred2 = nNL([6,3,3,1.5])
+    pred3 = nNL([7.5,4,6,2])
+
+    # NeuralNetLearner might be wrong. If it is, check if prediction is in range.
+    assert pred1 == 0 or pred1 in range(len(classes))
Collaborator:
This is OK for now. But in the long run, two things:
(1) There should be a way to test any learner, not just NeuralNetLearner.
(2) To deal with the possibility of nondeterministic learners getting it wrong, there should be experiments, with overall results graded (for example, the learner must get at least 2 out of 3 answers right).

Collaborator Author:

Thanks for the feedback, I appreciate it.

  1. I'm not sure I fully understand this; could you elaborate? I could write a function which, given a learner as input, runs that learner and returns the result, if that's what you are asking (a rough sketch of the idea appears below).

  2. That is a great idea! I will get to work.
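For illustration only, here is a minimal sketch of the kind of helper being discussed: a function that takes any trained learner, runs it on a few labelled examples, and passes as long as it gets at least a given number right. The name grade_learner, the example points, and the thresholds are hypothetical and are not the code merged in this PR or in #496.

```python
# Sketch only, assuming aima-python's learning module provides DataSet and NeuralNetLearner.
from learning import DataSet, NeuralNetLearner


def grade_learner(predict, tests, min_correct=2):
    """Grade any learner: `predict` maps an input vector to a class index,
    `tests` is a list of (example, expected_class) pairs. The check passes
    if at least `min_correct` predictions are right, so a nondeterministic
    learner is allowed an occasional miss."""
    correct = sum(predict(example) == expected for example, expected in tests)
    return correct >= min_correct


def test_neural_network_learner():
    iris = DataSet(name="iris")
    iris.classes_to_numbers(["setosa", "versicolor", "virginica"])

    nNL = NeuralNetLearner(iris, [5], 0.15, 75)
    tests = [([5, 3, 1, 0.1], 0),
             ([6, 3, 3, 1.5], 1),
             ([7.5, 4, 6, 2], 2)]
    # Require 2 of 3 correct rather than exact agreement on every example.
    assert grade_learner(nNL, tests, min_correct=2)
```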

Collaborator Author:

I have implemented (2) in #496. I hope this is what you were asking for.

+    assert pred2 == 1 or pred2 in range(len(classes))
+    assert pred3 == 2 or pred3 in range(len(classes))
 
 
 def test_perceptron():
     iris = DataSet(name="iris")
-    iris.remove_examples("virginica")
 
-    classes = ["setosa","versicolor","virginica"]
     iris.classes_to_numbers()
 
+    classes_number = len(iris.values[iris.target])
+
     perceptron = PerceptronLearner(iris)
-    # PerceptronLearner might be wrong. Just check if prediction is in range.
-    assert perceptron([5,3,1,0.1]) in range(len(classes))
+    pred1 = perceptron([5,3,1,0.1])
+    pred2 = perceptron([6,3,4,1])
+    pred3 = perceptron([7.5,4,6,2])
+
+    # PerceptronLearner might be wrong. If it is, check if prediction is in range.
+    assert pred1 == 0 or pred1 in range(classes_number)
+    assert pred2 == 1 or pred2 in range(classes_number)
+    assert pred3 == 2 or pred3 in range(classes_number)