tests/run-tests.py: Add test statistics to output _result.json file. #17296
Conversation
@hmaerki FYI
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@           Coverage Diff           @@
##           master   #17296   +/-   ##
=======================================
  Coverage   98.54%   98.54%
=======================================
  Files         169      169
  Lines       21897    21897
=======================================
  Hits        21579    21579
  Misses        318      318
```
An alternative to this would be to write the names of all the passed and skipped tests to the `_result.json` file as well.
Another benefit would be that, if the reporting tool supports it, flaky tests could be traced. The file structure could be:

```json
{
    "failed_tests": [
        "extmod/vfs_blockdev_invalid.py",
        "extmod/vfs_rom.py"
    ],
    "successful_tests": [
        "xy.py"
    ],
    "skipped_tests": []
}
```

This would be my favorite; however, I am also fine with just the summary above.
OK, I've now updated this PR to provide a full list of passed, skipped and failed tests in the `_result.json` file.
I reviewed the MR and it looks good to me. The complementary octoprobe commits are here:

Test result against this PR:

NOTE: The failed/skipped test counters are still missing for
The output `_result.json` file generated by `run-tests.py` currently contains a list of failed tests. This commit adds to the output a list of passed and skipped tests, and so now provides full information about which tests were run and what their results were.

Signed-off-by: Damien George <damien@micropython.org>
Force-pushed from ea4d4c5 to 7a55cb6.
Thanks for testing, the Octoprobe output summary is now looking a lot better!
Yes, that's the next thing to improve.
Summary
This commit adds some simple test statistics to the output `_result.json` file generated at the end of a test run. This will be useful for Octoprobe (an automated hardware testing framework) to report a summary of the number of tests passed/skipped/failed.
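As a rough sketch of the idea (not the PR's actual code; the key names and the `write_results` helper are assumptions for illustration), the statistics can be collected into a dict and dumped as JSON at the end of the run:

```python
import json

def write_results(path, passed, skipped, failed):
    # Illustrative dict shape; the real run-tests.py may use different keys.
    results = {
        "passed_count": len(passed),
        "skipped_count": len(skipped),
        "failed_count": len(failed),
        "passed_tests": sorted(passed),
        "skipped_tests": sorted(skipped),
        "failed_tests": sorted(failed),
    }
    with open(path, "w") as f:
        json.dump(results, f, indent=4)

# Example call with hypothetical test names.
write_results("_result.json", ["basics/int_small.py"], [], ["extmod/vfs_rom.py"])
```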
Testing
Running a few tests on an esp8266, the output `_result.json` is:

EDIT: The implementation has now been changed to output the entire list of passed/skipped/failed tests. The above output is now:
Trade-offs and Alternatives
This is a simple implementation that overwrites the `_results.json` file each time. So if you run again with `./run-tests.py --run-failures`, the statistics will only reflect the rerun, not the original run. IMO that's acceptable (and Octoprobe doesn't use the `--run-failures` feature).

Eventually a similar thing needs to be added to the other test runners, eg `run-multitests.py`.
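If preserving the original statistics across a `--run-failures` rerun ever matters, one possible workaround (not part of this PR) is to merge the rerun results back into the previous file. A minimal sketch, assuming both files use the list-based keys discussed above; `merge_rerun` and the paths are hypothetical:

```python
import json

def merge_rerun(prev_path, rerun_path, out_path):
    # Illustrative only: assumes both files use the list keys shown above.
    with open(prev_path) as f:
        prev = json.load(f)
    with open(rerun_path) as f:
        rerun = json.load(f)
    # Tests that passed on the rerun move from the failed list to passed.
    now_passed = set(rerun["passed_tests"])
    merged = {
        "passed_tests": sorted(set(prev["passed_tests"]) | now_passed),
        "skipped_tests": sorted(prev["skipped_tests"]),
        "failed_tests": sorted(set(prev["failed_tests"]) - now_passed),
    }
    with open(out_path, "w") as f:
        json.dump(merged, f, indent=4)

merge_rerun("_result.json", "_result_rerun.json", "_result_merged.json")
```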